freedomben · 5 days ago
Perhaps it's a cynical way to look at it, but in these days of the war on general-purpose computing and locked-down devices, I have to consider the news in terms of how it could be used against users and device owners. I don't know enough to provide useful analysis, so I won't try, but will instead pose questions to the much smarter people who might have some interesting thoughts to share.

There are two, non-exclusive paths I'm thinking at the moment:

1. DRM: Might this enable a next level of DRM?

2. Hardware attestation: Might this enable a deeper level of hardware attestation?

gpapilion · 5 days ago
Just to level-set here: I think it's important to realize this is really focused on allowing things like search to operate on encrypted data. This technique allows you to perform an operation on the data without ever decrypting it. Think of a row in a database with email, first name, last name, and mailing address. You want to search by email to retrieve the other fields, but don't want that data sitting unencrypted since it is PII.

In general, this solution would be expensive and targeted at data lakes, or areas where you want to run computation but not necessarily expose the data.
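To give a concrete flavor of "search without decrypting": here is a minimal sketch of a far weaker cousin of FHE search, a deterministic "blind index" built from a keyed hash. This is NOT homomorphic encryption; it only supports exact-match lookup and leaks equality patterns, and all names and keys below are made up for illustration.

```python
import hmac, hashlib

# Hypothetical client-held key; the server never sees it.
key = b"index-key-held-by-the-client"

def blind_index(email: str) -> str:
    # Deterministic keyed tag (HMAC as a PRF): the server can match
    # tags for equality without learning the underlying email address.
    return hmac.new(key, email.encode(), hashlib.sha256).hexdigest()

# The server stores rows keyed by the blind index; the row itself is
# encrypted under a separate key, so the server only ever sees tags.
db = {blind_index("alice@example.com"): "<ciphertext of first/last/address>"}

# To search by email, the client recomputes the tag and the server
# does an ordinary lookup on opaque strings.
assert blind_index("alice@example.com") in db
assert blind_index("bob@example.com") not in db
```

Real FHE-based search avoids even the equality-pattern leakage this toy has, at a much higher computational cost.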

With regard to DRM, one key thing to remember is that it has to be cheap and widely deployable. Part of the reason DVDs were easily broken is that the algorithm chosen was computationally inexpensive, so it could be installed on as many clients as possible.

15155 · 4 days ago
DVD players also didn't have great key revocation, forced field updates of keys and software, and the like. Blu-ray did, and was somewhat more effective. I also imagine console manufacturers have far more control over the supply chain at large.

Consoles after the original Xbox (which had an epic piracy ecosystem) all had online integration. The Xbox 360 had a massive piracy scene, but it was 100% offline only. The Xbox One has had no such breaches that I am aware of.

RE: BOM - famously, with many of these examples, certain specific disc drives or mainboards were far more compromised than others.

jasomill · 5 days ago
This is an exceptionally good point. For example, I suspect two major reasons DRM has been more successful on game consoles than video players are the much smaller ecosystems and much larger BOMs, not necessarily in that order.
jackyinger · 4 days ago
How is searching encrypted data not going to be used for exfiltration? What a terrible idea.

I’m sure you can name benign useful things you could use it for. But it seems to me you’re blatantly overlooking the obvious flaw.

There is no getting around the fact that doing search on encrypted data reduces the level of secrecy. To return an even minutely useful search result, some information within the searched corpus must be exposed.

egorfine · 5 days ago
> how it could be used against the users and device owners

Same here.

Can't wait to KYC myself in order to use a CPU.

observationist · 5 days ago
KYC = Kill Your Conscience

It's truly amazing how modern people just blithely sacrifice their privacy and integrity for no good reason. Just to let big tech corporations more efficiently siphon money out of the market. And then they fight you passionately when you call out those companies for being unnecessarily invasive and intrusive.

The four horsemen of the infocalypse are such profoundly reliable boogeymen that we really need a huge psychological study across all modern cultures to see why they're so effective at dismantling rational thought in the general public, and how we can inoculate society against it without damaging other important social behaviors.

Frieren · 5 days ago
> how it could be used against the users

We are no longer their clients; we are just another product to sell. So they do not design chips for us, but for the benefit of other corporations.

3. Unskippable ads with data gathering at the CPU level.

dimitrios1 · 5 days ago
I distinctly remember one of my more senior classes at university: designing logic gates, chaining together ANDs, NANDs, ORs, NORs, XORs, and then working our way up to numerical processors, ALUs, and eventually latches, RAM, and CPUs. The capstone was creating an assembly language to control it all.

I remember thinking how much fun it was! I could see, unfolding before me, endless ways to configure, reconfigure, optimize, etc.

I know there are a few open source chip efforts, but wondering maybe now is the time to pull the community together and organize more intentionally around that. Maybe open source chipsets won't be as fast as their corporate counterparts, but I think we are definitely at an inflection point now in society where we would need this to maintain freedom.

If anyone is working in that area, I am very interested. I am very green, but still have the old textbooks I could dust off (I just don't have the ol' college-provided Mentor Graphics -- or I guess Siemens now -- design tool anymore).

youknownothing · 5 days ago
I don't think it's applicable to DRM because you eventually need the decrypted content. DRM is typically used for books, music, video, etc., and you can't enjoy an encrypted video.

I think eGovernment is the main use case: not super high traffic (we're not voting every day), but very high privacy expectations.

freedomben · 5 days ago
Yes, it must be decrypted eventually, but I've read about systems (I think HDMI does this) where the keys are stored in the end device (like the TV or monitor) where the user can't access them. Given that we already have that, I think I agree that this news doesn't change anything, but I wonder if there are clever uses I haven't thought of.
gruez · 5 days ago
See: https://news.ycombinator.com/item?id=47323743

It's not related to DRM or trusted computing.

inetknght · 5 days ago
Not yet.
monocasa · 5 days ago
I mean, this would be perfect for the key provisioning portions of widevine or bluray.
vasco · 5 days ago
Regarding DRM, I don't see how it'll survive "camera in front of the screen" + "AI video upscaling" once the second part is good enough. You can't DRM between the screen and your eyes. Until they put DRM in Neuralink, anyway.
RiverCrochet · 5 days ago
> Can't DRM between the screen and your eyes.

No, but media can be watermarked in imperceptible ways, and then if all players are required to check and act on such watermarks, the gap becomes narrow enough to probably be effective.

See Cinavia.

benlivengood · 5 days ago
1. The private key is required to see anything computed under FHE, so DRM is pretty unlikely.

2. No, anyone can run the FHE computations anywhere on any hardware if they have the evaluation key (which would also have to be present in any FHE hardware).

ddtaylor · 5 days ago
HDCP does some of that already in many of your devices.
amelius · 5 days ago
I'm also thinking of what happens when quantum computing becomes available.

But when homomorphic encryption becomes efficient, perhaps governments can force companies to apply it (they would lose their opportunity for backdooring, but E2EE is a thing too, so I wouldn't worry too much).

evolve2k · 5 days ago
My thought is half cynical. As LLM crawlers seek to mop up absolutely everything, companies themselves start to worry more about keeping their own data secret. Maybe this is a reason for shifts like this; as encrypted and other privacy-preserving products become more in demand across the board.
F7F7F7 · 5 days ago
When we are at the point where society feels that privacy requires encryption at compute time... a product like this (or anything else in the supply chain) is not going to save it.

mathgradthrow · 5 days ago
No, because of the fundamental limitation of DRM. Content must be delivered as plaintext.
KoolKat23 · 5 days ago
This is quite the opposite; it's better than what we have.

It raises the hurdle for those looking to surveil.

If a tree falls in the forest and no one is around to hear it, does it make a sound?

This is primarily for cloud compute I'd imagine, AI specifically. As it's generally not feasible/possible to run the state of the art models locally. Think GDPR and data sovereignty concerns, many demand privacy and can't use services without it.

observationist · 5 days ago
Regarding DRM, you could use stream ciphers and other well-understood cryptography schemes with an FHE chip like this to create an effectively tamper-proof and interception-proof OS, with the FHE chip supplementing normal processors. You'd basically be setting up E2EE between the streaming server and the display, audio output, or other stream target, and there'd be no way to intercept or inspect unencrypted data without breaking the device. Add modern tamper detection and you get a very secure setup with modern performance, the FHE chip basically just handling keys and encapsulation operations, with fairly low compute and bandwidth needs. DRM and attestation both, as well as fairly dystopian manufacturer and corporate controls over devices users should own.
brookst · 4 days ago
You’re right it’s a cynical take. I don’t get cynicism for the sake of it, detached from technical reality.

No, this does nothing for DRM or HW attestation. The interesting thought is: not everything is a conspiracy. Yes, that’s just what a conspirator would say. But it’s also true.

coliveira · 4 days ago
Not everything is a conspiracy, yes. But when we have a class of conspirators in power, and we do have, everything can be used by the conspiracy.
zvqcMMV6Zcr · 5 days ago
> Heracles, which sped up FHE computing tasks as much as 5,000-fold compared to a top-of-the-line Intel server CPU.

That is a nice speed-up compared to generic hardware, but everyone probably wants to know how much slower it is than performing the same operations on plaintext data. I am sure a 50% penalty is acceptable; 95% probably is not.

corysama · 5 days ago
There are applications that are currently doing this without hardware support and accepting much worse than 95% performance loss to do so.

This hardware won’t make the technique attractive for ALL computation. But, it could dramatically increase the range of applications.

bobbiechen · 5 days ago
Agreed. When I was working on TEEs/confidential computing, just about everyone agreed that FHE was conceptually attractive (trust the math instead of trusting a hardware vendor), but the overhead of FHE was insanely high. Think 1,000x slowdowns turning your hour-long batch job into something that takes over a month to run instead.

Foobar8568 · 5 days ago
Now we know why Intel more or less abandoned SEAL and rejected GPU requests.
Foobar8568 · 4 days ago
It was Microsoft who did the library, damn. I can't understand how I misremembered that after working on it for a few months last year.

bilekas · 5 days ago
This is incredible work.. And makes the technology absolutely viable.

However... in a world where privacy is constantly being eroded intentionally by governments and private companies, I think this will NEVER, ever reach any consumer-grade hardware. My cynical side could envision a worldwide technology export ban in the vein of the RSA restrictions [0].

Why would any company offer customers real, out-of-the-box E2E encryption built into their devices?

DRM was mentioned by another user. This will not be used to enable privacy for the masses.

https://en.wikipedia.org/wiki/Export_of_cryptography_from_th...

FrasiertheLion · 5 days ago
Arguably this is less useful for consumer hardware in the first place. This is mostly useful when I don’t trust the service provider with my data but still need to use their services (casting my vote, encrypted inference, and so forth)
bilekas · 5 days ago
True. In the case of casting a vote, for example, I could see it being used within the voting machine itself before the vote is sent off to be counted. Good application.

But making them available to customers, say as a PCIe card or something, and having everything you run automatically encrypted over an encrypted connection, would be a dream.

autoexec · 5 days ago
> In a world where privacy is constantly being eroded intentionally by governments and private companies, I think this will NEVER, ever reach any consumer grade hardware.

Why not, when the government can just force companies to backdoor their hardware for them? That way users are secure most of the time, except from the government (until the backdoor in Intel's chips gets discovered, anyway); users have a false sense of security/privacy, so they're more likely to share their secrets with corporations; and the government gets to spy on people communicating more openly with each other.

sota_pop · 2 days ago
Is this whole concept essentially a fundamental misunderstanding of the difference between "encryption" and "encoding"? I don't mean to be pedantic, and don't want to make assumptions given my respect for the source, but I don't understand how you can meaningfully manipulate data that has been _actually_ encrypted. Doesn't the ability to accurately manipulate it imply that you have some understanding of its underlying meaning? The article is light on algorithmic details:

> "...a mathematical transformation, sort of like the Fourier transform. It encrypts data using a quantum-computer-proof algorithm..."

I am assuming there is some deep learning at play here i.e. it is manipulating the data within the latent space. If this is true, then would the embedding process really be considered "encryption"? You could argue it is security through obscurity (in the sense that the latent space basis is arbitrary/learned), but it feels like two different things to me.

subset · a day ago
(Disclaimer: I am not a cryptographer and this is a heavily simplified explanation.) Homomorphic encryption is built on the foundation of "hard problems" (e.g. the Learning with Errors problem): loosely, computational problems that are thought to be impossible to reverse without possession of a secret key.

The crux of HE is that it provides a _homomorphism_: you map from the space of plaintexts to the space of ciphertexts, but the mapping preserves arithmetic operations such as addition and multiplication. To be clear, this means the server can add and multiply the ciphertexts, but the plaintext result of that operation is still irreversible without the private key. To the server, it looks like random noise.

I don't think it's helpful to think about this as connected to deep learning or embedding spaces. An excellent resource I'd recommend is Jeremy Kun's guide: https://www.jeremykun.com/2024/05/04/fhe-overview/
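The additive case is easy to demonstrate with classic Paillier encryption, which is additively (not fully) homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. A toy sketch with deliberately tiny, insecure parameters:

```python
import math, random

# Toy Paillier parameters: real deployments use 2048+ bit primes.
p, q = 17, 19
n, n2 = p * q, (p * q) ** 2
g = n + 1                              # standard choice; simplifies decryption
lam = math.lcm(p - 1, q - 1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)   # modular inverse mod n

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:         # r must be a unit mod n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Multiplying ciphertexts adds the plaintexts; without the private
# key (lam, mu), the product is still just noise to the server.
c = (encrypt(5) * encrypt(7)) % n2
assert decrypt(c) == 12
```

Fully homomorphic schemes (supporting both addition and multiplication of plaintexts, arbitrarily deep) are built on lattice problems like LWE rather than on Paillier, but the homomorphism idea is the same.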

mmaunder · 5 days ago
Someone explain how you'd create a vector embedding from homomorphically encrypted data without decrypting it. Seems like a catch-22: you don't get to know the semantic meaning, but you need the semantic meaning to position it in high-dimensional space. I guess the point I'm making is that, sure, you can sell compute for FHE, but you quickly run up against a hard limit on any value-added SaaS you can provide the customer. This feels like a solution that's being shoehorned in because cloud providers really, really want customers to use their data centers, when in truth the best solution would be a secure facility for the customer, so that applications can actually understand the data they're working with.
bob1029 · 5 days ago
Most of modern machine learning is effectively linear algebra. We can achieve semantic search over encrypted vectors if the encryption relies on similar principles.
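As a sketch of that idea: with even an additively homomorphic scheme (toy Paillier below, insecure demo parameters), a server can compute the inner product of a client's encrypted vector against its own plaintext query vector, which is the core operation of inner-product similarity search. This is an illustration of the principle, not how a production FHE vector database works.

```python
import math, random

# Toy Paillier setup (illustration only; real parameters are 2048+ bit).
p, q = 17, 19
n, n2 = p * q, (p * q) ** 2
g, lam = n + 1, math.lcm(p - 1, q - 1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def enc(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

vec = [3, 1, 4]                        # client's private vector
evec = [enc(v) for v in vec]           # sent to the server encrypted
query = [2, 0, 5]                      # server-side plaintext query

# Server side: raising a ciphertext to a plaintext constant multiplies
# the underlying plaintext by it; the running product accumulates an
# encryption of the inner product <vec, query>.
c = 1
for ci, qi in zip(evec, query):
    c = (c * pow(ci, qi, n2)) % n2

assert dec(c) == 26                    # 3*2 + 1*0 + 4*5, decrypted by client
```

The server learns nothing about `vec` or the score; only the key holder can decrypt the similarity value.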
Chance-Device · 5 days ago
FHE is the future of AI. I predict local models with encrypted weights will become the norm. Both privacy preserving (insofar as anything on our devices can be) and locked down to prevent misuse. It may not be pretty but I think this is where we will end up.
boramalper · 5 days ago
If you're interested in "private AI", see Confer [0] by Moxie Marlinspike, the founder of Signal private messaging app. They go into more detail in their blog. [1]

[0] https://confer.to/

[1] https://confer.to/blog/2025/12/confessions-to-a-data-lake/

CamperBob2 · 5 days ago
I don't get how this can work, and Moxie (or rather his LLM) never bothers to explain. How can an LLM possibly exchange encrypted text with the user without decrypting it?

The correct solution isn't yet another cloud service, but rather local models.

Reptur · 5 days ago
If encrypted outputs can be viewed or used, they can be reverse-engineered through that same interface. FHE shifts the attack surface; it does not eliminate it.
Chance-Device · 5 days ago
If you know how to reverse engineer weights or even hidden states through simple text output without logprobs I’d be interested in hearing about it. I imagine a lot of other people would be too.
anon291 · 5 days ago
I mean, no, they cannot be viewed at any point once encrypted unless you have the key. That's the point. Even the intermediate steps are random gibberish unless you have the key.
Foobar8568 · 5 days ago
FHE is impractical by any measure. Either it's trivially broken and insecure, or the space requirements go beyond anything usable.

There is basically no business demand apart from sellers and scholars.

eulgro · 5 days ago
In science fiction, maybe. We're hitting real limits on compute while AI is still far from a level where it would be harmful, and FHE is orders of magnitude less efficient than direct computation.
kittikitti · 4 days ago
I find it petty for Intel to describe more software-based approaches to fully homomorphic encryption (FHE) as "software cheats", especially since their competitor, Duality Technologies, specializes in the software side and is certainly much smaller.

When giant corporations like Intel are able to label their smaller competitors' technology as "software cheats", it becomes an incredibly toxic environment. If anyone did the same to Intel, they would be sued for libel, slander, and other anti-competitive tactics.

However, I shouldn't be surprised. The industry normalizes this type of discourse. At the same time, the same giant corporations will preach about AI safety and claim you can only trust them with it.

That being said, this is a great innovation by Intel. I was impressed by the technology and the thorough discussion of how this type of computing relates to GPUs and CPUs. It's especially interesting given the emergence of computational-memory applications.
