d_silin · 2 months ago
It is a hardware RNG they are building. The claim is that their solution is going to be more computationally efficient for a narrow class of problems (de-noising step for diffusion AI models) vs current state of the art. Maybe.

This is what they are trying to create, more specifically:

https://pubs.aip.org/aip/apl/article/119/15/150503/40486/Pro...

modeless · 2 months ago
It's not just a "hardware RNG". An RNG outputs a uniform distribution. This hardware outputs randomness with controllable distributions, potentially extremely complex ones, many orders of magnitude more efficiently than doing it the traditional way with ALUs. The class of problems that can be solved by sampling from extremely complex probability distributions is much larger than you might naively expect.
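To make the distinction concrete, here's a toy sketch (mine, not Extropic's) of what it takes to turn a uniform RNG into even a simple non-uniform distribution in software: inverse-transform sampling of an exponential. Every distribution needs its own transform, and complex multivariate distributions have no closed-form inverse at all, forcing expensive iterative methods like MCMC.

```python
import math
import random

def sample_exponential(rate: float, rng: random.Random) -> float:
    """Turn one uniform draw into an exponential draw via the
    inverse CDF: F^-1(u) = -ln(1 - u) / rate."""
    u = rng.random()  # uniform in [0, 1)
    return -math.log1p(-u) / rate

rng = random.Random(0)
samples = [sample_exponential(2.0, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
# The sample mean should sit near the true mean 1/rate = 0.5.
```

For a distribution as complex as a trained energy-based model there is no such closed form, which is where hardware that natively emits non-uniform randomness could pay off.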

I was skeptical of Extropic from the start, but what they've shown here exceeded my low expectations. They've made real hardware which is novel and potentially useful in the future after a lot more R&D. Analog computing implemented in existing CMOS processes that can run AI more efficiently by four orders of magnitude would certainly be revolutionary. That final outcome seems far enough away that this should probably still be the domain of university research labs rather than a venture-backed startup, but I still applaud the effort and wish them luck.

bfkwlfkjf · a month ago
> The class of problems that can be solved by sampling from extremely complex probability distributions is much larger than you might naively expect.

Could you provide some keywords to read more about this?

A_D_E_P_T · 2 months ago
An old concept indeed! I think about this Ed Fredkin story a lot... In his words:

"Just a funny story about random numbers: in the early days of computers people wanted to have random numbers for Monte Carlo simulations and stuff like that and so a great big wonderful computer was being designed at MIT’s Lincoln laboratory. It was the largest fastest computer in the world called TX2 and was to have every bell and whistle possible: a display screen that was very fancy and stuff like that. And they decided they were going to solve the random number problem, so they included a register that always yielded a random number; this was really done carefully with radioactive material and Geiger counters, and so on. And so whenever you read this register you got a truly random number, and they thought: “This is a great advance in random numbers for computers!” But the experience was contrary to their expectations! Which was that it turned into a great disaster and everyone ended up hating it: no one writing a program could debug it, because it never ran the same way twice, so ... This was a bit of an exaggeration, but as a result everybody decided that the random number generators of the traditional kind, i.e., shift register sequence generated type and so on, were much better. So that idea got abandoned, and I don’t think it has ever reappeared."

RIP Ed. https://en.wikipedia.org/wiki/Edward_Fredkin

Imnimo · 2 months ago
And still today we spend a great deal of effort trying to make our randomly-sampled LLM outputs reproducibly deterministic:

https://thinkingmachines.ai/blog/defeating-nondeterminism-in...

rcxdude · 2 months ago
It's funny, because that idea did actually reappear at some point with RDRAND. But it's still only really used for cryptography; if you just need a random distribution, almost everyone uses a PRNG (a non-cryptographic one is a lot faster still, besides being deterministic).
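For illustration (my sketch, not from the thread), a minimal xorshift64 PRNG shows why: a few shift/XOR ops per number, no entropy source, and the same seed always replays the same stream, which is exactly what makes programs debuggable.

```python
MASK64 = (1 << 64) - 1

def xorshift64(state: int) -> int:
    """One step of Marsaglia's xorshift64: three shift/XOR ops,
    deterministic and very cheap compared to a hardware entropy read."""
    state ^= (state << 13) & MASK64
    state ^= state >> 7
    state ^= (state << 17) & MASK64
    return state

def stream_from(seed: int, n: int) -> list[int]:
    """Generate n values; reseeding with the same value replays the stream."""
    out, s = [], seed
    for _ in range(n):
        s = xorshift64(s)
        out.append(s)
    return out

a = stream_from(0x9E3779B97F4A7C15, 5)
b = stream_from(0x9E3779B97F4A7C15, 5)
# a == b: the run is reproducible, unlike the TX-2's Geiger-counter register.
```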
vlovich123 · 2 months ago
Generating randomness is not a bottleneck and modern SIMD CPUs should be more than fast enough. I thought they’re building approximate computation where a*b is computed within some error threshold p.
UltraSane · 2 months ago
Generating enough random numbers with the right distribution for Gibbs sampling, at incredibly low power, is what their hardware does.
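For a concrete picture of what Gibbs sampling consumes, here is a toy two-spin Ising model (my sketch, not Extropic's code): every spin update needs one fresh Bernoulli draw with a state-dependent bias, and that per-update draw is the operation the hardware is meant to make cheap.

```python
import math
import random

def gibbs_ising(J: float, sweeps: int, seed: int = 0) -> float:
    """Gibbs-sample a 2-spin Ising model with coupling J.
    Returns the fraction of sweeps in which the spins agree."""
    rng = random.Random(seed)
    s = [1, 1]
    agree = 0
    for _ in range(sweeps):
        for i in (0, 1):
            h = J * s[1 - i]                   # local field from the neighbor
            p_up = 1 / (1 + math.exp(-2 * h))  # P(s_i = +1 | neighbor)
            s[i] = 1 if rng.random() < p_up else -1
        agree += s[0] == s[1]
    return agree / sweeps

# With J = 1 the exact agreement probability is e^2 / (e^2 + 1) ~= 0.88,
# so the empirical estimate should land near that.
frac_agree = gibbs_ising(J=1.0, sweeps=20_000)
```

Scale this to millions of coupled variables and the biased-coin draws dominate the energy budget, which is the case Extropic is making.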
jazzyjackson · 2 months ago
I think that's underselling it a bit, since there are lots of existing ways to have a hardware RNG. They're trying to use lots and lots of hardware RNGs to solve probabilistic problems a little more probabilistically.
pclmulqdq · 2 months ago
I tried this, but not with the "AI magic" angle. It turns out nobody cares because CSPRNGs are random enough and really fast.
adrian_b · 2 months ago
The article you linked uses magnetic tunnel junctions to implement the RNG part.

The Extropic web site claims that their hardware devices are made with standard CMOS technology, which cannot produce magnetic tunnel junctions.

So it appears that there is no connection between the linked article and what Extropic does.

The idea of stochastic computation is not at all new. I read about stochastic computers as a young child, more than half a century ago, long before personal computers. The research on them was inspired by hypotheses about how the brain might work.

Along with analog computers, stochastic computers were abandoned due to the fast progress of deterministic digital computers, implemented with logic integrated circuits.

So anything new cannot be about the structure of stochastic computers, which has been well understood for decades, but only about a novel, extremely compact hardware RNG device that could be scaled to a huge number of RNGs per stochastic computer.

In a brief browse of the Extropic site, I could not find any description of the principle behind their hardware RNG, except that it is made with standard CMOS technology. There are plenty of devices made in standard CMOS that can be used as RNGs, but they are not reliable enough for stochastic computation (unless you add complex compensation circuits), so Extropic must have found some neat trick to avoid the complex circuitry, assuming their claims are correct.

However, I am skeptical about their claims because of the amount of BS on their pages, which reads like pseudo-scientific Star Trek mumbo-jumbo, e.g. "thermodynamic computing", "accelerated intelligence", "Extropic" derived from "entropic", and so on.

To be clear, there is no such thing as "thermodynamic computing", and inventing such meaningless word combinations is insulting to potential customers, as it demonstrates that Extropic's management believes they must be naive morons.

The traditional term for such computing is "stochastic computing". "Stochastics" is an older, and in my opinion better, name for probability theory; in Ancient Greek, "stochastics" means the science of guessing. Instead of "stochastic computing" one can say "probabilistic computing", but not "thermodynamic computing", which makes no sense (unless the Extropic computers are dual use: besides computing, they also provide heating and hot water for a great number of houses!).

Like analog computers, stochastic computers are a good choice only for low-precision computations. With increasing precision, the amount of required hardware grows much faster for analog and stochastic computers than for deterministic digital computers.
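For anyone unfamiliar, the textbook construction (a sketch of the classic scheme, not of Extropic's hardware): encode a value p in [0, 1] as the density of 1s in a random bitstream, and multiplication collapses to a bitwise AND of independent streams. The catch is exactly the precision scaling above: the estimate converges only as ~1/sqrt(N), so each extra bit of precision roughly quadruples the stream length.

```python
import random

def to_stream(p: float, n: int, rng: random.Random) -> list[bool]:
    """Encode p in [0, 1] as a random bitstream whose 1-density is p."""
    return [rng.random() < p for _ in range(n)]

def stream_mul(a: list[bool], b: list[bool]) -> list[bool]:
    """Stochastic multiply: P(a_i AND b_i) = p_a * p_b
    when the two streams are independent."""
    return [x and y for x, y in zip(a, b)]

def decode(bits: list[bool]) -> float:
    return sum(bits) / len(bits)

rng = random.Random(42)
n = 100_000  # estimation error shrinks only as ~1/sqrt(n)
prod = decode(stream_mul(to_stream(0.5, n, rng), to_stream(0.8, n, rng)))
# prod should land near 0.5 * 0.8 = 0.4
```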

The only currently important application that is happy with precisions under 16 bits is AI/ML, so marketing their product for AI applications is normal for Extropic, but they should provide more meaningful information about what advantages it might have.

trevor_extropic · 2 months ago
If you want to understand exactly what we are building, read our blogs and then our paper

https://extropic.ai/writing

https://arxiv.org/abs/2510.23972

throwaway_7274 · 2 months ago
I was hoping the preprint would explain the mysterious ancient runes on the device chassis :(
helicone · 2 months ago
i dig it.

people are so scared of losing market share because of an art choice that they make all of their products smooth dark grey rectangles with no features.

ugly.

at least this one has some sense of beauty, the courage to make a decision about what looks good to them and act on it. they'll probably have to change the heptagon shape because no way that becomes a standard

it costs so little to add artistic flair to a product, it's really a shame so few companies do

ehnto · 2 months ago
It looks super cool. I feel like I'm watching cyberpunk come to life with the way we're talking about technology these days, but this also looks straight out of the Neuromancer of my imagination.
ipsum2 · 2 months ago
The answer is that they're cosplaying sci-fi movies, in attempt to woo investors.
chermi · 2 months ago
Could you explain to me how you could reasonably justify not citing even one of Normal Computing's works? I can't imagine you're unaware of them or their works. You cite the Thermodynamic Computing group paper.


helicone · 2 months ago
can you play doom on it, yet?
nfw2 · 2 months ago
I don't really understand the purpose of hyping up a launch announcement and then not making any effort whatsoever to make the progress comprehensible to anyone without advanced expertise in the field.
ipsum2 · 2 months ago
That's the intention. Fill it up with enough jargon and gobbledegook that it looks impressive to investors, while hiding the fact that there's no real technology underneath.
fastball · 2 months ago
You not comprehending a technology does not automatically make it vaporware.
frozenseven · 2 months ago
>jargon and gobbledegook

>no real technology underneath

They're literally shipping real hardware. They also put out a paper and posted their code.

Flippant insults will not cut it.

maradan · 2 months ago
"no really technology underneath" zzzzzzzzzzz
lacy_tinpot · 2 months ago
What's not comprehensible?

It's just miniaturized lava lamps.

nfw2 · 2 months ago
A lava lamp that just produces randomness, e.g. for cryptography purposes, is different from the benefit here, which is producing specific randomness at low energy cost.
hirako2000 · a month ago
It proves the point: it mesmerizes the audience.

One can build a true random number generator by feeding the movements of a computer mouse into it.

It would be easy to put a dozen cages with mouse wheels in them, real mammals inside, to generate lots of random numbers. But everyone would understand that, so it would only be funny; they want mysterious!

alyxya · 2 months ago
This seems to be the page that describes the low level details of what the hardware aims to do. https://extropic.ai/writing/tsu-101-an-entirely-new-type-of-...

To me, the biggest limitation is that you'd need an entirely new software stack to support a new paradigm, and it doesn't seem compatible with existing pretrained models. There are plenty of ways to build much more efficient paradigms of computation, but it'll be a long while before any are mature enough to show substantial value.

vlovich123 · 2 months ago
I've been wondering how long it would take for someone to try probabilistic computing for AI workloads; the imprecision inherent in the workload makes it well suited to AI matrix math with a significant power reduction. My professor in university was researching this space and it seemed very interesting. I never thought it could supplant CPUs, but massive compute applications that don't require precise math, like 3D rendering (and now AI), always seemed like a natural fit.
Imustaskforhelp · 2 months ago
I don't think it does AI matrix math with a significant power reduction; it just seems to provide RNG? I may be wrong, but with my limited knowledge I don't think what you're saying is true. Maybe someone can clarify whether it can do AI matrix math with a significant power reduction, or whether that's even their goal right now. To me it currently feels like a lava-lamp equivalent, as another commenter said.
rcxdude · 2 months ago
The paper talks about some quite old-school AI techniques (the kind of thing I learned about in university a decade ago, when they were already on their way out). It has nothing to do with matrix multiplications (well, nothing to do with computing them faster directly); instead it's about sampling from a complex distribution more efficiently by having dedicated circuits that simulate elements of that distribution in hardware. So it won't make your neural nets any faster.
6510 · 2 months ago
I'm still waiting for my memristors.
quantumHazer · 2 months ago
there is also Normal Computing[0], which is trying different approaches to chips like this. Anyway, these are very difficult problems, and Extropic has already abandoned some of its initial claims about superconductors to pivot to more classical CMOS circuits[1]

[0]: https://www.normalcomputing.com

[1]: https://www.zach.be/p/making-unconventional-computing-practi...

est · 2 months ago
I listened to the Hinton podcast a few days ago; he mentioned (IIRC) that "analog" AIs are bad because the models cannot be transferred or duplicated losslessly (the way a .gguf file can). Every analog system is built slightly differently, so you have to re-learn/re-train it somehow.

Do TSUs have the same issue?

Void_ · 2 months ago
This gives me Devs vibe (2020 TV Series) - https://www.indiewire.com/awards/industry/devs-cinematograph...
vortegne · 2 months ago
That's what they're trying to do, yeah. To give off a cool vibe I mean. To raise more money. There is nothing even remotely as cool in their real (or not) product. I was very excited when they started specifically because of their cool branding, but the vibe quickly wears off.
tcdent · 2 months ago
Such an underrated TV show.
Void_ · 2 months ago
Yes, the billionaire driving a Subaru Forester was my favorite part