Readit News
EricBurnett · 3 years ago
I've long been enamored with the idea of learning from analog computers to build the next generation of digital ones. From one perspective, all our computers are analog of a sort: today's computer chips effectively leverage electron flow through a carefully arranged metal/silicon substrate, with self-interference via electromagnetic fields used to construct transistors and build up higher-order logic units. We're now working on photonic computers, presumably with some new property leading to self-interference, allowing transistors/logic above that.

"Wires" are a useful convenience in the electron world, to build pathways that don't degrade with the passing of the elections themselves. But if we relax that constraint a bit, are there other ways we can build up arrangements of "organized flow" sufficient to have logic units arise? E.g. imagine pressure waves in a fluid -filled container, with mini barriers throughout defining the possible flow arrangement that allows for interesting self-reflections. Or way further out, could we use gravitational waves through some dense substance with carefully arranged holes, self-interfering via their effect on space-time, to do computations for us? And maybe before we get there, is there a way we could capitalize on the strong or weak nuclear force to "arrange" higher frequency logical computations to happen?

Physics permits all sorts of interactions, and we only really use the simple, easy-to-conceptualize ones so far, which I hope and believe leaves lots more for us to grow into :).

sandworm101 · 3 years ago
Electricity is also a wave. The wires are essentially waveguides for particles/waves traveling at near-luminal speeds. So in theory anything done with electricity could be replicated using other waves, but to make it faster you would need waves that travel faster than electrons through a wire. Photons through a vacuum might be marginally faster, but pressure waves through a fluid would not be.

If bitflips are a problem in a modern chip, imagine the number of problems if your computer ran on gravity waves. The background hum of billions of star collisions cannot be blocked out with grounded tinfoil. There is no concept of a faraday cage for gravity waves.

stochtastic · 3 years ago
Nitpick: gravity waves [1] pretty universally refer to waves in fluid media in which the restoring force is buoyancy. Ripples in spacetime are usually called _gravitational_ waves.

[1] https://en.wikipedia.org/wiki/Gravity_wave

[2] https://en.wikipedia.org/wiki/Gravitational_wave

EricBurnett · 3 years ago
You're right that the speed of light remains a constant limitation on propagation delay, but the defining limitation on the speed of computation is rather the clock speed - how long it takes for each round of computation. Electrons are comparatively slow due to the time it takes to fill and stabilize a transistor. Our hypothetical new type of computer will have to be faster to converge, rather than faster to propagate.

You're right about the bit flips though. I don't know if a gravitational wave computer is actually ever going to be feasible, just an interesting dream for the far future. Hopefully there are more options to consider in the meantime :).

lupire · 3 years ago
Gravity is a poor medium for computation because it is incredibly weak: roughly 10^-43 the strength of the electromagnetic force between two electrons. Even if you add several powers of 10 for all the metal wire harness and battery chemistry around the electrons, you still get far more usable force per gram from electricity and metal than you do from gravity.
markisus · 3 years ago
Is it even theoretically possible to waveguide gravity? The electric field can be positive and negative, but gravity is unsigned -- there is no anti-gravity. This is probably related to what you're saying about faraday cages.
ethn · 3 years ago
Electricity travels much faster than the electrons themselves (which drift at only ~3 cm/s!); the signal propagates at a large fraction of the speed of light, with its energy flow described by the Poynting vector.
DoingIsLearning · 3 years ago
> So in theory anything done with electricity could be replicated using other waves

I sort of get this in a discrete digital logic scenario, but out of curiosity, as someone not big on photonics: what would be the light 'equivalent' of an electrical AC signal? I'm kind of struggling to visualize that.

altruios · 3 years ago
A faraday cage for gravity waves would be awesome... I mean - computers are nice - but you hit the nail on the head for revolutionary tech.
323 · 3 years ago
> It employs two-dimensional quasiparticles called anyons, whose world lines pass around one another to form braids in a three-dimensional spacetime (i.e., one temporal plus two spatial dimensions). These braids form the logic gates that make up the computer. The advantage of a quantum computer based on quantum braids over using trapped quantum particles is that the former is much more stable.

https://en.wikipedia.org/wiki/Topological_quantum_computer

Deleted Comment

huachimingo · 3 years ago
It's like procedural generation: hide the data in a formula/algorithm so it takes less space.

Replace "data" with "computation", and "formula" with physical, less expensive processes.

dhon_ · 3 years ago
This reminds me of the method of calculating Fourier transform by refracting light through a prism and reading off the different frequencies. You get the "calculation" for free.
Enginerrrd · 3 years ago
Analog computers are pretty awesome!

Say you take a standard slide rule with two log scales, and want to do a division problem, x/y. There's more than one way to do it. I can think of at least 3. One of them won't just compute x/y for your particular x, but will compute x/y for ANY x.
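
The log-scale arithmetic a slide rule does mechanically can be sketched in a few lines (a toy illustration, not real slide-rule practice): sliding one log scale against the other subtracts log(y) from log(x), and the quotient is read off as the antilog.

```python
import math

def slide_rule_divide(x, y):
    """Divide the way a slide rule does: subtract lengths on log
    scales, then read the result back off the scale (the antilog)."""
    return math.exp(math.log(x) - math.log(y))

print(slide_rule_divide(6.0, 3.0))  # ≈ 2.0
```

The "any x" trick the comment mentions corresponds to leaving the scales aligned at a fixed y: every x on one scale then sits opposite x/y on the other, with no further motion needed.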

Accuracy is always the issue with analog stuff, but they sure are neat.

Another fun one to contemplate is spaghetti sort. With an analog computer of sufficient resolution, you can sort n elements in O(n). You represent the numbers being sorted by lengths of spaghetti. Then you put them on the table straight up and bring a flat object down until it hits the first and largest piece of spaghetti. You set that down and repeat the process, selecting the largest element of the set every time.
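
A digital simulation of spaghetti sort makes the trick (and its catch) explicit: the physical version does each "tallest strand" selection in O(1), since the descending surface touches it first, while the simulation below has to scan and is really just selection sort in disguise.

```python
def spaghetti_sort(values):
    """Simulate spaghetti sort: repeatedly 'lower a flat surface' onto
    the strands and remove the tallest one it touches first."""
    strands = list(values)          # one strand of spaghetti per number
    result = []
    while strands:
        tallest = max(strands)      # the surface touches this strand first
        strands.remove(tallest)     # set it aside
        result.append(tallest)
    return result[::-1]             # collected largest-first; reverse for ascending

print(spaghetti_sort([5, 1, 4, 2, 3]))  # [1, 2, 3, 4, 5]
```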

I've always liked the idea of hybrid systems. I envision one where you feed the analog part of your problem with a DAC, then get a really close answer up to the limit of your precision from the analog component, then pass that back out to an ADC and you have a very very close guess to feed into a digital algorithm to clean up the precision a bit. I bet you could absolutely fly through matrix multiplication that way. You could also take the analog output and adjust the scale so it's where it needs to be on the ambiguous parts, then feed it back into your analog computer again to refine your results.
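
The hybrid loop described here can be sketched numerically. In this toy model (an assumption, not real hardware), the "analog" component is an exact solve plus noise, and the digital side cleans up precision by classic iterative refinement on the residual:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50)) + 50 * np.eye(50)  # a well-conditioned system
b = rng.standard_normal(50)

def analog_solve(A, b, rel_noise=1e-3):
    """Stand-in for the analog component: fast, but only accurate to a
    few digits (modeled here as additive relative noise)."""
    x = np.linalg.solve(A, b)
    return x + rel_noise * np.linalg.norm(x) * rng.standard_normal(x.shape)

# One pass through the 'analog' hardware gives a close guess...
x = analog_solve(A, b)

# ...then the digital side refines it by feeding each residual,
# computed digitally in full precision, back through the analog solver.
for _ in range(3):
    r = b - A @ x
    x = x + analog_solve(A, r)

print(np.linalg.norm(b - A @ x))  # residual shrinks far below the analog noise floor
```

Each pass multiplies the error by roughly the analog solver's relative noise, so a few round trips recover digital precision from low-precision analog passes.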

hbarka · 3 years ago
Where does a doctor’s stethoscope fit in? Other examples: Mechanic’s stethoscope for diagnosing an engine, airplane vibrations to foretell maintenance, bump oscillations to grade quality of a roadway.
TedDoesntTalk · 3 years ago
> spaghetti sort

Isn’t this how very old sorting machines with punch cards worked? I’m thinking of the kinds used by the census or voting machines in the late 1800s or early 1900s.

ulnarkressty · 3 years ago
An even better one: placing an image at the focal plane of a lens produces its Fourier transform at the focal plane on the other side of the lens[0]. It is used for "analog" pattern matching[1]. There is an interesting video explaining this on the Huygens Optics Youtube channel[2].

[0] - https://en.wikipedia.org/wiki/Fourier_optics

[1] - https://en.wikipedia.org/wiki/Optical_correlator

[2] - https://www.youtube.com/watch?v=Y9FZ4igNxNA
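
The optical correlator idea can be mimicked digitally, with np.fft.fft2 standing in for the lens: multiply the scene's spectrum by the conjugate spectrum of a reference pattern (a matched filter), and the inverse transform lights up wherever the pattern sits in the scene. The image data here is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.random((64, 64))
scene -= scene.mean()           # zero-mean for a clean correlation peak
patch = scene[20:36, 40:56]     # the pattern we want to locate

ref = np.zeros_like(scene)      # reference pattern, zero-padded to scene size
ref[:16, :16] = patch

# The lens would form these transforms optically, "for free".
corr = np.fft.ifft2(np.fft.fft2(scene) * np.conj(np.fft.fft2(ref))).real
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(peak)                     # brightest spot marks where the patch was cut from
```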

dhon_ · 3 years ago
Fascinating, thank you for sharing!
teshier-A · 3 years ago
Surprised to see no mention of LightOn and its Optical Processing Unit!
nurettin · 3 years ago
This is solarpunk material.
sshlocalhost98 · 3 years ago
This is such an interesting way of computing. I watched Veritasium's video on the analog way of computing neural networks.
SilasX · 3 years ago
Like how mirrors "compute" the (appropriately defined) reverse of an image?
alliao · 3 years ago
oh god, I can see it coming: an elaborate analogue music player for a special price. it's using nothing but light. the fuzzy output will be its feature, sought after by misdirected audiophiles...
stackbutterflow · 3 years ago
Is it calculation or simulation?
Banana699 · 3 years ago
Not much difference here. Calculation (or, more generally, computation) is the manipulation of abstract symbols according to pure rules that may or may not represent concrete entities, e.g. the simplification of polynomials according to the rule of adding like powers.

Simulation is when we manipulate things (concrete or abstract) according to the rules that govern other concrete things, e.g. pushing around balls in circles to (highly inaccurately) represent the orbit of planets around a star.

Not all calculation is simulation, and not all simulation is calculation, but there exists an intersection of both.

The key trick you can do with that last category is that when the physical system you're simulating is controllable enough, you can use the correspondence in the other direction: use the concrete things to simulate the abstract things. It's simulation, because you're manipulating concrete entities according to the rules that govern other entities (which happen to be abstract), but what you're doing also amounts to doing a calculation with those abstract entities.

Deleted Comment

V-2 · 3 years ago
This perspective fits nicely with the simulation theory.

If we accept it, for argument's sake, then what's happening is essentially delegating the computation to the ultra-meta-computer that runs the simulation.

arrow7000 · 3 years ago
It also fits nicely with the universe just being mathematically consistent
andrewflnr · 3 years ago
There is nothing that doesn't fit nicely with simulation theory. It's an epistemological tarpit.
Syzygies · 3 years ago
Whether the universe is a simulation is unknowable, but the universe could consist of thought. If so, this research is dangerous; like the Trinity nuclear test, the conflagration could alter our neighborhood of the universe.

I had a pretty convincing revelation last night that the simulation was run by insects. I could only get back to sleep by ridiculing myself for such a derivative thought. Or is there a reason it's universal?

Deleted Comment

pseudolus · 3 years ago
I believe one of the earliest applications incorporating this line of thought was MONIAC, the Monetary National Income Analogue Computer, which used water levels to model the economy [0]. There's a short youtube documentary on its history and operation. [1]

[0] https://en.wikipedia.org/wiki/MONIAC

[1] https://www.youtube.com/watch?v=rAZavOcEnLg&ab_channel=Reser...

https://youtu.be/rAZavOcEnLg?t=101 (shows operation of MONIAC)

wardedVibe · 3 years ago
Analog computers are from the 19th century; they were used to decompose signals using the Fourier transform, since it's easy(ish) to get a bunch of different frequency oscillators. They used them for tides and differential equations. https://en.m.wikipedia.org/wiki/Analog_computer
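
The tide predictors mentioned were harmonic synthesizers: Kelvin's machine summed the outputs of mechanical oscillators, one per tidal constituent, via wires and pulleys. A minimal sketch of the same sum (amplitudes and phases here are made up; only the 12.42 h period loosely follows the real M2 constituent):

```python
import math

# (amplitude, frequency in cycles/hour, phase) per constituent;
# illustrative values, not a real harmonic analysis of any port.
constituents = [
    (1.2, 1 / 12.42, 0.3),
    (0.5, 1 / 12.00, 1.1),
    (0.2, 1 / 23.93, 2.0),
]

def tide_height(t_hours):
    """The sum the machine's pulleys computed mechanically."""
    return sum(a * math.cos(2 * math.pi * f * t_hours + p)
               for a, f, p in constituents)

heights = [tide_height(t) for t in range(24)]  # a day of hourly predictions
```
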
lisper · 3 years ago
A better title would have been: how to make the universe do (a whole lot of) math for us [1]. What so-called neural networks do should not be confused with thinking, at least not yet.

And the fact that we can get the universe to do math for us should not be surprising: we can model the universe with math, so of course that mapping works in the other direction as well. And this is not news. There were analog computers long before there were digital ones.

---

[1] ... using surprisingly small amounts of hardware relative to what a digital computer would require for certain kinds of computations that turn out to be kind of interesting and useful in specific domains. But that's not nearly as catchy as the original.

misja111 · 3 years ago
> we can model the universe with math, so of course that mapping works in the other direction as well.

This is not so obvious as you make it appear. For instance, we can model the weather for the next couple of days using math. But letting the weather of the next couple of days calculate math for us doesn't work very well. The reason is that we can't set the inputs for the weather.

This problem comes up in various forms and shapes in other 'nature computers' as well. Quantum computers are another example where the model works brilliantly but setting the pre- and side conditions in the real world is a major headache.

lisper · 3 years ago
I didn’t mean to imply that implementing it should be easy. Only that it should be unsurprising that it is possible.
goldenkey · 3 years ago
You can use the weather, or a bucket of water, or really any sufficiently complex chaotic system, as a reservoir computer though:

https://en.wikipedia.org/wiki/Reservoir_computing

zmgsabst · 3 years ago
> What so-called neural networks do should not be confused with thinking, at least not yet.

I disagree:

I think neural networks are learning an internal language in which they reason about decisions, based on the data they’ve seen.

I think tensor DAGs correspond to an implicit model for some language, and we just lack the tools to extract that. We can translate reasoning in a type theory into a tensor DAG, so I’m not sure why people object to that mapping working the other direction as well.

V__ · 3 years ago
This internal language, if I'm not mistaken, is exactly what the encoder and decoder parts of the neural networks do.

> in which they reason about decisions

I'm in awe of what the latest neural networks can produce, but I'm wary to call it “reasoning” or “deciding”. NNs are just very complex math equations and calling this intelligence is, in my opinion, muddying the waters of how far away we are from actual AI.

troyvit · 3 years ago
We can model the universe with math because math is what we have to model the universe with. The fact that it can talk back to us in math is amazing because to me it means that math is not a dead end cosmically, which means we might be able to use it to communicate with other intelligences after all.
hans1729 · 3 years ago
assertion: thinking is synonymous with computation (composed operations on symbolic systems).

computation is boolean algebra.

-> therefore, doing math is to think.

I'm not trying to be pedantic, I just don't think using intuitive associations with words helps clarify things. If your definition of thought diverges here, please try to specify how exactly: what is thought, then? Semi-autonomous "pondering"? Because the closer I look at it, that, too, becomes boolean algebra: calling eval() on some semantic construct, which boils down to symbolic logic.

What you may mean is that "neural" networks are performing statistics instead of algebra, but that's not what the article is about, is it?

sweetdreamerit · 3 years ago
> I don't think using intuitive associations with words helps clarifying things

Sincere question: do you think that "think using intuitive associations with words" can be safely translated to "compute using intuitive associations with words"? I don't think so. Therefore, even if thinking is also computing, reducing thinking to boolean algebra is a form of reductionism that ignores a number of emergent properties of (human) thinking.
meroes · 3 years ago
Are ruler-and-compass constructions computation? They don't operate symbolically, yet they are computers.
mannykannot · 3 years ago
> If your definition for thought diverges here, please try to specify how exactly: what is thought, then?

This is a burden-shifting reply of "so prove me wrong!" to anyone who feels that your assertion lacks sufficient justification for it to be taken as an axiom.

mensetmanusman · 3 years ago
Humans are the universe thinking for itself.
jb1991 · 3 years ago
Indeed, it is said that "life is the universe's way of looking back at itself."
ben_w · 3 years ago
This is more like “Surprise! Turns out panpsychism was the right answer all along!”
hprotagonist · 3 years ago
[[ liebniz chuckling in the background ]]
lolive · 3 years ago
Ok. But then:

- what is the question?

- what is the answer?

bobsmooth · 3 years ago
Sure, but I'd prefer a computer without self-doubt.
the_other · 3 years ago
Some self doubt is critical for right thinking.
willis936 · 3 years ago
A system without introspection would never self-improve.
spideymans · 3 years ago
Then perhaps the best way to make the universe think for us is to produce a biological computer, similar in nature to a brain.
guerrilla · 3 years ago
So... slaves?
guerrilla · 3 years ago
Well, all animals are...
rbn3 · 3 years ago
This instantly reminded me of the paper "Pattern Recognition in a Bucket"[0], which I saw referenced a lot when I first started reading about AI in general. I only have surface-level knowledge of the field, but how exactly does what's described in the article differ from reservoir computing? (The article doesn't mention that term, so I assume there must be a difference.)

[0] https://www.researchgate.net/publication/221531443_Pattern_R...

grahamrow · 3 years ago
In this PNN approach you are solving for what additional stimuli, when applied to the system alongside the inputs, produce the desired result for a given input. In reservoir computing (RC) you don’t bother to provide any additional stimuli, and find the linear combination of reservoir outputs that gives the desired result. Training the former is more demanding and analogous to a NN (thus the name), but directly produces your answer from the system. The latter is very easy to train (one regression) but requires post processing for inference.
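
The reservoir-computing recipe described here ("one regression" on the readout) can be sketched as a minimal echo state network; the reservoir sizes, scaling, and the delay-recall task below are all illustrative choices, not canonical values.

```python
import numpy as np

rng = np.random.default_rng(1)
n_res, n_steps = 200, 1000

# The 'reservoir' is a fixed random dynamical system; it is never trained.
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale for the echo-state property
w_in = rng.standard_normal(n_res)

u = rng.uniform(-1, 1, n_steps)                  # input signal
states = np.empty((n_steps, n_res))
x = np.zeros(n_res)
for t in range(n_steps):
    x = np.tanh(W @ x + w_in * u[t])             # untrained reservoir dynamics
    states[t] = x

# Task: recover the input from 3 steps ago using only reservoir states.
delay = 3
X, y = states[delay:], u[:-delay]

# Training really is a single ridge regression on the linear readout.
w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
rmse = np.sqrt(np.mean((X @ w_out - y) ** 2))
print(rmse)  # small error: the readout recovered the delayed input
```

Everything task-specific lives in the readout regression; swap the bucket of water (or any rich physical system) in for the tanh network and the training procedure is unchanged.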
rbn3 · 3 years ago
aaah - got it, thanks!
willhinsa · 3 years ago
The universe is already thinking for itself! It wrote this comment and built this website, after all.
tabtab · 3 years ago
And trying to expel humans after seeing them in action.
discreteevent · 3 years ago
When I first came across machine learning it reminded me of control theory. And sure enough if you search around you get to articles like this [1] saying that neural networks were very much inspired by control theory. The bit of control theory that I was taught way back was about analog systems. I have no idea if the electronic circuit mentioned at the end is even like a classical control system but it does feel a bit like something coming around full circle.

[1] https://scriptedonachip.com/ml-control