djoldman · a year ago
mkl · a year ago
Interesting that the hardware is NVidia Blackwell, not Google TPUs. That means Google will likely have an energy efficiency and cost advantage, and keep their proprietary hardware out of other people's reach.
crowcroft · a year ago
Getting a whole business set up to build TPU hardware for third parties (design, build, sell, support, etc.) is probably not worth it when there is overflowing demand for TPUs in their cloud already.

Businesses running their own hardware probably prefer CUDA as well for being more generally useful.

bitexploder · a year ago
Part of the reason for this is likely customers' preference to have CUDA available, which TPUs do not support. TPUs are superior for many use cases, but customers like the portability of targeting CUDA.
re-thc · a year ago
> not Google TPUs

They're in limited supply. Even Google doesn't have enough for their own use.

WalterGR · a year ago
Google doesn’t make TPUs available to 3rd parties, right? I assume there would be tremendous reverse-engineering risk if they were to include them?
DSingularity · a year ago
It’s telling that their effort was to get an air gapped solution cleared for US government and the US military.
rwmj · a year ago
A bit thin on detail, but will this require confidential VMs with encrypted GPUs? (And I wonder how long before someone cracks SEV-SNP and TDX and pirate copies escape into the wild.)
vasco · a year ago
At the pace models improve, the advantage of going the dark route shouldn't really hold for long, unless I'm missing something.
miohtama · a year ago
Access to proprietary training data: Search, YouTube, Google Books might give some moat.
unsnap_biceps · a year ago
The number of folks who have the hardware at home to run it is going to be very low, and the risk to companies of leaking it is gonna make it unlikely IMHO.
notpushkin · a year ago
I think home users would be the least of their concerns.
RadiozRadioz · a year ago
It only takes one company to leak it
BiteCode_dev · a year ago
They can get "hacked" and whoops.
bjackman · a year ago
> I wonder how long before someone cracks SEV-SNP

https://bughunters.google.com/blog/5424842357473280/zen-and-...

NoahZuniga · a year ago
I'd expect watermarked model weights plus a lot of liability to disincentivize leaking the model.

nsriv · a year ago
This might be a great way for them to strengthen their model through federated learning.

https://federated.withgoogle.com/

amarcheschi · a year ago
I did my undergrad internship on federated learning. I was tasked with implementing different federated algorithms in a simulator, so as to have a way to compare them meaningfully. The last one that had to be implemented was FedMA. We didn't manage to do it. That algorithm is absolutely devilish. Every issue that I solved made two other issues arise, and not even my supervisors could help. The sheer idea of matching neurons across different networks might (and does) make sense, but the way the approximate costs are calculated requires another 2-3 math papers, of which I could follow only the first lines of the abstract. I'm happy for the time I spent in my internship there. I'm also happy it's over.

The general understanding of how it works is surprisingly easy though, you can find the paper here https://arxiv.org/abs/2002.06440
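For intuition, the baseline that FedMA improves on, FedAvg, is much simpler: each client trains locally and the server averages the resulting weights. A minimal sketch on a toy linear model (names and the one-step "training" are illustrative, not the paper's implementation):

```python
import numpy as np

def local_update(weights, data, lr=0.1):
    # Placeholder local "training": one gradient step of least squares
    X, y = data
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def fedavg(global_weights, client_data, rounds=50):
    for _ in range(rounds):
        updates, sizes = [], []
        for data in client_data:
            updates.append(local_update(global_weights.copy(), data))
            sizes.append(len(data[1]))
        # Server step: average client models, weighted by local dataset size
        total = sum(sizes)
        global_weights = sum(w * (n / total) for w, n in zip(updates, sizes))
    return global_weights

# Two toy clients whose data both follow y = 2x
rng = np.random.default_rng(0)
clients = []
for _ in range(2):
    X = rng.normal(size=(20, 1))
    clients.append((X, 2 * X[:, 0]))

w = fedavg(np.zeros(1), clients)
```

FedMA replaces the naive coordinate-wise average with a matching step that first aligns neurons across client networks, which is where the hard math comes in.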

ein0p · a year ago
The whole point of deploying such things on-prem is air-gapping it from Google and its "learning".
fc417fc802 · a year ago
That's the point of the privacy scheme. It would only be able to learn things common to multiple clients. Private data wouldn't make it through the noise.
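That idea can be sketched in a few lines, DP-SGD style: clip each client's contribution so no one client dominates, then add noise to the aggregate. This is an illustrative sketch of the general technique, not Google's actual scheme; all names and parameters are made up:

```python
import numpy as np

def private_aggregate(updates, clip=1.0, noise_std=0.5, rng=None):
    rng = rng or np.random.default_rng()
    clipped = []
    for u in updates:
        norm = np.linalg.norm(u)
        # Bound each client's influence on the aggregate
        clipped.append(u * min(1.0, clip / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    # Gaussian noise masks any individual client's contribution
    total += rng.normal(scale=noise_std * clip, size=total.shape)
    return total / len(updates)

# Three clients with large raw updates; the result stays bounded
updates = [np.full(4, 10.0) for _ in range(3)]
agg = private_aggregate(updates, rng=np.random.default_rng(0))
```

Signal shared by many clients survives the averaging; any one client's private quirks get clipped and drowned in the noise.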
holografix · a year ago
This is obvious government contract baiting. Kudos though, they might actually move some Google Distributed Cloud this way.
noitpmeder · a year ago
Financial firms with significant on-prem datacenter use will love this as well. My company still stays away from the cloud -- we have 6 DCs in the building, and run everything else out of colocated racks.
brcmthrowaway · a year ago
Who provides the internet?
aduffy · a year ago
I don’t think so. To my knowledge GCP has no approval for classified networks, which is by far the hardest part. Contrast with Azure OpenAI, which has been approved to run on government networks for over a year now.

This feels like a play for companies in highly regulated industries, GCP has a notable list of biopharma customers.

Maxious · a year ago
>Today at Google Cloud Next, we're thrilled to announce another significant milestone for Google Public Sector: the authorization of Google Distributed Cloud Hosted (GDC Hosted) to host Top Secret and Secret missions for the U.S. Intelligence Community, and Top Secret missions for the Department of Defense (DoD).

https://cloud.google.com/blog/topics/public-sector/google-pu...

ZeroCool2u · a year ago
FedRAMP High is the mark you really want to hit for the US Government and GCP's service coverage is surprisingly broad in that realm.
skybrian · a year ago
From Google's blog post:

> Our GDC air-gapped product, which is now authorized for US Government Secret and Top Secret missions, and on which Gemini is available, provides the highest levels of security and compliance.

nkassis · a year ago
Banking as well; this is the kind of offering they've been looking for for a while. Google just saw the demand and decided to jump in, while OpenAI and Anthropic probably calculated they don't have the manpower to deal with the support for this.

reaperducer · a year ago
This is obvious government contract baiting

You don't have to be a government agency to not want your company's data all over the place.

connicpu · a year ago
With a few exceptions for companies with highly secretive data, you do have to be a government agency, or work in a highly regulated government-adjacent area, for secured private clouds to be a requirement carved in stone and therefore worth investing a ton of extra money into.
culopatin · a year ago
They’ll have to fight Microsoft, which has been promising Copilot.
_cs2017_ · a year ago
Curious if this was forced on Google Cloud by Sundar, or was it something that Google Cloud as an org wanted to do?

At first glance, it seems Google Cloud might lose some revenue from customers who can now deploy Gemini in-house. On the other hand, it's not a complete loss, since presumably Google Cloud is still involved in providing some of the underlying tech. Not to mention, some customers would never consider using an off-premises setup anyway.

wmf · a year ago
I assume Google Distributed Cloud is part of the larger Cloud org so they get the revenue either way. The on-prem version may even cost more.
throwaway48476 · a year ago
Like you can with DeepSeek? Or will it be more complicated and expensive? I don't know who would actually want that.
yoavm · a year ago
Absolutely many would, especially those with deep pockets. The biggest concern I'm hearing from companies adopting AI, for basically any use case, is data leaving their network. Especially (but not only) in the EU.
throwaway48476 · a year ago
Deepseek is just the model weights. Nothing about it requires network access.
surajrmal · a year ago
Folks who would prefer to run deepseek are not in the end customer for this product. Deepseek doesn't provide a service contract.
tziki · a year ago
I don't understand how Google is willing to do this but won't sell TPUs to other data centers. It should be obvious from Nvidia's market cap that they're missing a huge opportunity.
dehrmann · a year ago
The only reasons I can think of is they see them as their secret sauce, they don't want to support them for customers long-term, or they don't have the foundry capacity.
paxys · a year ago
It's definitely #3. The TPUs have to first satisfy Google's own computing needs, and only then can they start selling them to others. Given how much training and inference the company is doing and how much demand there is internally, it's very unlikely they are able to manufacture loads of extras, especially not profitably.
