pixelesque · 3 months ago
Note also that today China has told its tech companies to cancel any NVIDIA AI chip orders and not to order any more:

https://www.ft.com/content/12adf92d-3e34-428a-8d61-c91695119...

belter · 3 months ago
200% tariff incoming :-)

"Speaker Johnson says China is straining U.S. relations with Nvidia chip ban" - https://www.cnbc.com/2025/09/17/china-us-nvidia-chip-ban.htm...

Translation: "We are angry with China that they wont let the US undermine itself, and sell its strategic advantages to them..."

littlestymaar · 3 months ago
> "Speaker Johnson says China is straining U.S. relations with Nvidia chip ban"

Oh, the irony.

pxc · 3 months ago
Is this supposed to signal confidence in the chips already available on China's domestic chip market, or is it primarily aimed at boosting that market to make it ready?
bsder · 3 months ago
Yes. :)

How big a deal is it to be on the cutting edge with this? Given that models seem to be flattening out because they can't get any more data, the answer is "not as much as you would think".

Consequently, being a generation or two behind is annoying, but not fatal. In addition, if you pump the memory up, you can paper over a lot of performance loss. Look at how many people bought amped-up Macs because the unified memory was large, even though the processing units were underpowered relative to NVIDIA or AMD.

The biggest problem is software. And China has a lot of people to throw at software. The entire RISC-V ecosystem basically only exists because Chinese grad students have been porting everything in the universe over to it.

So, the signal to everybody watching is that the Chinese government is going to pump money at this. And that's a big deal.

People always seem to forget that Moore's Law is a self-fulfilling prophecy; it doesn't just happen out of thin air. It happens because a lot of companies pump a lot of money into the engineering, because falling off the semiconductor hamster wheel is death. The US started the domestic hamster wheel with things like VHSIC. TSMC was a direct result of the government pumping money into it. China can absolutely kickstart this for themselves if the money goes where it should.

I'm really torn about this. On the one hand, I hate what China does on many, many political fronts. On the other hand, tech monopolies are pillaging us all and, with no anti-trust action anywhere in the West, the only way to thwart them seems to be by China coming along and ripping them apart.

acdha · 3 months ago
I think those are both the case: they're telling Chinese companies to invest in domestic hardware (implicitly also saying things like being prepared to stop using CUDA), and that means the hardware vendors know not to skimp on getting there (a nicer version of burning the landing boats on the beach).

It's also an interesting signal to the rest of the world that they're going to be an option. American tech companies should be looking at what BYD is doing to Tesla, but they're also dealing with a change in government toward Chinese levels of control, with less maturity.

TrainedMonkey · 3 months ago
My cynical view is that it's mostly trade war and nationalism. If you follow the official PRC position, the chips are already made in China because TW is CN... Practically, buying TW chips is boosting its economy and hence funding its military, so from that perspective this makes sense. From a long-term development perspective this will absolutely boost the national market... however, that will take an insane amount of time. If you buy into the "AI is going to change everything" hype, this move is a huge handicap and hence a boon to external economies. And I am probably missing a ton of viewpoints... politics, meh.
MaoSYJ · 3 months ago
“grey market” smugglers gonna keep working on it
givemeethekeys · 3 months ago
Until now it was perfectly legal to buy Nvidia chips in China. It was the US that was blocking export.
seanmcdirmid · 3 months ago
Which way though? From the USA to China or China to the USA?
arbuge · 3 months ago
Are they allowed to rent them from server farms, datacenters, etc. located outside China that are able to procure them?
tonyhart7 · 3 months ago
"Are they allowed to rent them from server farms, datacenters, etc. located outside China that are able to procure them?"

Alibaba Cloud has many clusters outside China, so they probably can, since many countries friendly with China host them.

But it would be the same kind of power play as the US makes: they only permit whoever they accept.

UltraSane · 3 months ago
That is going to really slow LLM development in China. But more GPUs for everyone else!

Deleted Comment

sameermanek · 3 months ago
I mean, progress is already getting slow in the LLM development space, and their Qwen models are, well, good enough for the time being. Meanwhile, it's good for the world that they are working on their own chips; that way Nvidia will have to stop being comfortable.

This is a step in a good direction for everyone except Nvidia and its Chinese distribution network.

ponector · 3 months ago
They have more electric power (and it's cheaper) and more data centers. I bet AI development will be slower elsewhere.
rich_sasha · 3 months ago
Can someone ELI5 this to me? Nvidia has the market cap of a medium-sized country precisely because apparently (?) no one else can make chips like them. Great tech, hard to manufacture, etc - Intel and AMD are nowhere to be seen. And I can imagine it's a very tricky business!

China, admittedly full of smart and hard-working people, then just wakes up one day and in a few years covers the entire gap, to within some small error?

How is this consistent? Either:

- The Chinese GPUs are not that good after all

- Nvidia doesn't have any magical secret sauce, and China could easily catch up

- Nvidia IP is real but Chinese people are so smart they can overcome decades of R&D advantage in just a few years

- It's all stolen IP

To be clear, my default guess isn't that it is stolen IP, rather I can't make sense of it. NVDA is valued near infinity, then China just turns around and produces their flagship product without too much sweat..?

rsynnott · 3 months ago
> because apparently (?) no one else can make chips like them

No, that's not really why. It is because nobody else has their _ecosystem_; they have a lot of soft lock-in.

This isn’t just an nvidia thing. Why was Intel so dominant for decades? Largely not due to secret magic technology, but due to _ecosystem_. A PPC601 was substantially faster than a pentium, but of little use to you if your whole ecosystem was x86, say. Now nvidia’s ecosystem advantage isn’t as strong as Intel’s was, but it’s not nothing, either.

(Eventually, even Intel itself was unable to deal with this; Itanium failed miserably, largely due not to external competition but due to competition with the x86, though it did have other issues.)

It’s also notable that nvidia’s adventures in markets where someone _else_ has the ecosystem advantage have been less successful. In particular, see their attempts to break into mobile chip land; realistically, it was easier for most OEMs just to use Qualcomm.

zenmac · 3 months ago
If what you say is true, isn't one of DeepSeek's big contributions that they wrote some custom lower-level GPU-cluster-to-GPU-cluster communication protocol instead of using the Nvidia software ecosystem? And that it's open sourced?
robotnikman · 3 months ago
>In particular, see their attempts to break into mobile chip land;

I wouldn't exactly say it was a failure; all those chips ended up being used in the Nintendo Switch.

bbatha · 3 months ago
It’s several factors and all of your alternatives are true to some degree:

1. An H20 is about 1.5 generations behind Blackwell. This chip looks closer to about two generations behind top-end Blackwell chips. So being roughly five years behind is not that impressive, especially since EUV, which China has no capacity for, is likely going to be a major obstacle to catching up.

2. Nvidia continues to dominate on the software side. AMD chips have been competitive on paper for a while and have had limited uptake. Chinese government mandates could obviously correct this after substantial investment in the software stack, but that is probably several years out.

3. China has poured trillions of dollars into its academic system and graduates more than 3x the number of electrical engineers the US does. The US immigration system has also been training Chinese students, but its much more limited work visa program has transferred a lot of that knowledge back, without even touching IP issues.

4. Of course IP theft covers some of it.

Melatonic · 3 months ago
They also have some insane power generation capability - it doesn't seem that far-fetched that they just build a shitload of slower chips and eat the costs of lower power efficiency.
bangaladore · 3 months ago
> China has poured trillions of dollars into its academic system and graduates more than 3x the number of electrical engineers the US does.

This metric is not as important as it seems when they have ~5x the population.

FooBarWidget · 3 months ago
What gave you the impression that it's "without too much sweat"? They sweated insanely for the past 6 years.

They also weren't starting from scratch, they already had a domestic semiconductor ecosystem, but it was fragmented and not motivated. The US sanctions united them and gave them motivation.

Also "good" is a matter of perspective. For logic and AI chips they are not Nvidia level, yet. But they've achieved far more than what western commentators gave them credit for 4-5 years ago. And they're just getting started. Even after 6 years, what you're seeing is just the initial results of all that investment. From their perspective, not having Nvidia chips and ASML equipment and TSMC manufacturing is still painful. They're just not paralyzed, and use all that pain to keep developing.

With power chips they're competitive, maybe even ahead. They're very strong at GaN chip design and manufacturing.

Western observers keep getting surprised by China's results because they buy into stereotypes and simple stories too much ("China can't innovate and can only steal", "authoritarianism kills innovation", "China is collapsing anyway", "everything is fake, they rely on smuggled chips lol" are just a few popular tropes) instead of watching what China is actually doing. Anybody even casually paying attention to news and rumors from China, instead of self-congratulatory western reports about China, could have seen this day coming. This attitude, and the phenomenon of being repeatedly surprised, is not limited to semiconductors.

FuriouslyAdrift · 3 months ago
AMD's chips outperform nVidia's (Instinct is the GPU compute line at AMD), and in a lower per-watt and per-dollar range.

AMD literally can't make enough chips to satisfy demand because nVidia buys up all the fab capacity at TSMC.

greenpizza13 · 3 months ago
Would you care to provide sources?

It's NVIDIA, not nVIDIA. I don't think AMD outperforms NVIDIA chips at price per watt. You need to defend this claim.

markus92 · 3 months ago
Per dollar sure but they’re quite a bit off per watt. Plus the software ecosystem is still not there.
amelius · 3 months ago
My question would be: how did they fab it without access to ASML's high-end lithography machines?

https://www.theguardian.com/technology/2024/jan/02/asml-halt...

FooBarWidget · 3 months ago
They've gone all-in with using less advanced equipment (DUV instead of EUV) but advanced techniques (multi patterning). Also combined with advanced packaging techniques.

Also, they're working hard on replacing ASML DUV machines as well since the US is also sanctioning the higher end of DUV machines. Not to mention multiple parallel R&D tracks for EUV.

You also need to distinguish between design and manufacturing. A lot of Chinese chip news is about design. Lots of Chinese chip designers are not yet sanctioned, and fabricate through TSMC.

Chip design talent pool is important to have, although I find that news a bit boring. The real excitement comes from chip equipment manufacturers, and designers that have been banned from manufacturing with TSMC and need to collaborate with domestic manufacturers.

RyanShook · 3 months ago
I think Alibaba uses TSMC as their foundry, like everyone else. I would assume that they did use ASML machines for this.
BrawnyBadger53 · 3 months ago
The article seems to only show it being similar to the H20 in memory specs (and still a bit short). Regardless, Nvidia has their moat through CUDA, not the hardware.
impossiblefork · 3 months ago
>- Nvidia doesn't have any magical secret sauce, and China could easily catch up

This is the simple explanation. We'll also see European companies matching them in time, probably on inference first.

3eb7988a1663 · 3 months ago
This is more my thinking as well. How many big tech companies are working on their own internal TPU chip? Google started using them in 2015. It sounds like the basic theory of getting silicon to do matrix multiplication is well established (a rough sketch of the idea is below). Sure you can always be more efficient, but getting a working chip sounds very approachable. AMD hardware has been ~competitive the entire time, but they have squandered all goodwill with their atrocious software support.

If China sees an existential risk to getting compute capacity, I can easily see an internal decree to make something happen. Even if it requires designing the hardware + their own CUDA-like stack.
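
To illustrate the "well-established theory" point above: a TPU-style matrix unit is essentially a grid of multiply-accumulate cells, with the weights held in place while activations stream through. Here's a toy Python/NumPy sketch of that dataflow, purely illustrative and not any vendor's actual design:

```python
import numpy as np

def systolic_matmul(A, W):
    """C = A @ W computed the way a weight-stationary MAC grid would:
    grid row r holds the weights W[r, :], activation rows stream through,
    and partial sums accumulate as they flow down the columns."""
    m, k = A.shape
    k2, n = W.shape
    assert k == k2
    C = np.zeros((m, n))
    for i in range(m):                 # one activation row of A at a time
        partial = np.zeros(n)          # partial sums moving down the columns
        for r in range(k):             # each grid row does one multiply-accumulate
            partial += A[i, r] * W[r, :]
        C[i] = partial
    return C

# sanity check against the reference result
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 8))
W = rng.standard_normal((8, 3))
assert np.allclose(systolic_matmul(A, W), A @ W)
```

The hard parts are everything around this (memory hierarchy, interconnect, compilers), not the arithmetic itself.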

mdemare · 3 months ago
> has the market cap of a medium-sized country

"According to investors, today's value of Nvidia's expected future profits over its lifetime equals the total monetary value of all final goods and services produced within a medium-sized country in a year."

Don't compare market cap with GDP, when you spell it out it's clear how nonsensical it is.

anothernewdude · 3 months ago
Flagship? No, the H20 was their cut-down chip that they were allowed to sell to China.
tmottabr · 3 months ago
No, that was the H800.

The H200 is the next generation of the H100.

lotsofpulp · 3 months ago
> Nvidia has the market cap of a medium-sized country

This makes no sense. Market cap is share price times number of shares, there is no analog for a country. It’s also not comparable to the GDP of a country, since GDP is a measure of flow in a certain time period, whereas market cap is a point in time measurement of expected performance.

gchadwick · 3 months ago
I'd say there's a mix of 'Chinese GPUs are not that good after all' and 'Nvidia doesn't have any magical secret sauce, and China could easily catch up' going on. Nvidia GPUs are indeed remarkable devices with a complex software stack that offers all kinds of possibilities that you cannot replicate overnight (or over a year or two!)

However they've also got a fair amount of generality, anything you might want to do that involves huge amounts of matmuls and vector maths you can probably map to a GPU and do a half decent job of it. This is good for things like model research and exploration of training methods.

Once this is all developed you can cherry pick a few specific things to be good at and build your own GPU concentrating on making those specific things work well (such as inference and training on Transformer architectures) and catch up to Nvidia on those aspects even if you cannot beat or match a GPU on every possible task, however you don't care as you only want to do some specific things well.

This is still hard and model architectures and training approaches are continuously evolving. Simplify things too much and target some ultra specific things and you end up with some pretty useless hardware that won't allow you to develop next year's models, nor run this year's particularly well. You can just develop and run last year's models. So you need to hit a sweet spot between enough flexibility to keep up with developments but don't add so much you have to totally replicate what Nvidia have done.

Ultimately the 'secret sauce' is just years of development producing a very capable architecture that offers huge flexibility across differing workloads. You can short-cut that development by reducing flexibility or not caring your architecture is rubbish at certain things (hence no magical secret sauce). This is still hard and your first gen could suck quite a lot (hence not that good after all) but when you've got a strong desire for an alternative hardware source you can probably put up with a lot of short-term pain for the long-term pay off.

FooBarWidget · 3 months ago
What does "are not good after all" even mean? I feel there are too many value judgements in that question's tone, that blindsides western observers. I feel like the tone has the hidden implication of "this must be fake after all, they're only good at faking/stealing, nothing to see here move along".

Are they as good as Nvidia? No. News reporters have a tendency to hype things up beyond reality. No surprises there.

Are they useless garbage? No.

Can the quality issues be overcome with time and R&D? Yes.

Is being "worse" a necessary interim step to become "good"? Yes.

Are they motivated to become "good"? Yes.

Do they have a market that is willing to wait for them to become "good"? Also yes. It used to be no, but the US created this market for them.

Also, comparing Chinese AI chips to Nvidia is a bit like comparing AWS with Azure. Overcoming compatibility problems is not trivial, you can't just lift and shift your workload to another public cloud, you are best off redesigning your entire infra for the capabilities of the target cloud.

ndai · 3 months ago
Isn't NVIDIA fabless? I imagine (I jump to conclusions) that design is less of a challenge than manufacturing. EUV lithography is incredibly difficult, almost implausible. Perhaps one day a clever scientist will come up with a new, seemingly implausible, yet less difficult way, using "fractal chemical" doping techniques.
hollerith · 3 months ago
>design is less of a challenge than manufacturing.

If so, can you explain why Nvidia's market cap is much higher than TSMC's? ($4.15 trillion versus $1.10 trillion)

SilverElfin · 3 months ago
Perhaps China's actions are less of a problem for Nvidia and more of a problem for other chip makers. After all, if Alibaba can make this chip, what justifies the valuation of companies like Groq?
spacephysics · 3 months ago
Defaulting to China stealing IP is a perfectly reasonable first step.

China is known for countless thefts of European and especially American IP, selling it for a quarter of the price, and destroying the original company nearly overnight.

It's so bad that even NASA has begun to restrict hiring Chinese nationals (which is more about national defense; however, illegally killing American companies can be seen as a national defense threat as well).

https://www.bbc.com/news/articles/c9wd5qpekkvo.amp

https://www.csis.org/analysis/how-chinese-communist-party-us...

robotnikman · 3 months ago
I'm not sure why you are being downvoted; this is well known, and many hacks in the past decade and a half involved exfiltrating stolen IP from various companies.
fearmerchant · 3 months ago
China's corporate espionage might have surpassed France at the winners' podium.
buckle8017 · 3 months ago
It's all stolen IP.

Virtually all products out of China still are.

If you want something manufactured, the best way is still to fake a successful crowdfunding campaign.

You'll be able to buy whatever it is on AliExpress (minus any safety features) within 6 months.

edm0nd · 3 months ago
Yup, this right here. The Chinese are estimated to steal hundreds of billions of dollars' worth of US IP every single year. It's the Chinese way: they just steal or copy everything. Whatever gets them ahead.
tsoukase · 3 months ago
Just my 2c, totally off the top of my head:

- Chinese labs managed to "overcome decades of R&D" because they have been trying for many years now with unlimited resources, government support and total disrespect for IP laws

- Chinese chips may not be competitive with Western ones in processing power per watt, but they have cheaper electricity and, again, unlimited capacity to absorb losses

- they will probably hit a wall at the software/ecosystem level. CUDA ergonomics are something very difficult to replicate and, you know, developers love ease of use

notfried · 3 months ago
If CUDA isn't that strong of a moat/tie-in and Chinese tech companies can seemingly reasonably migrate to these chips, why hasn't AMD been able to compete more aggressively with nVidia on a US/global scale when they had a much longer head start?
brookst · 3 months ago
1. AMD isn’t different enough. They’d be subject to the same export restrictions and political instability as Nvidia, so why would global companies switch to them?

2. CUDA has been a huge moat, but the incentives are incredibly strong for everybody except Nvidia to change that. The fact that it was an insurmountable moat five years ago in a $5B market does not mean it’s equally powerful in a $300B market.

3. AMD’s culture and core competencies are really not aligned to playing disruptor here. Nvidia is generally more agile and more experimental. It would have taken a serious pivot years ago for AMD to be the right company to compete.

FuriouslyAdrift · 3 months ago
AMD is HIGHLY successful in the GPU compute market. They have the Instinct line, which actually outperforms most nVidia chips for less money.

It's the CUDA software ecosystem they have not been able to overcome. AMD has had multiple ecosystem stalls, but it does appear that ROCm, which is open source and multi-vendor, is finally taking off.

AMD is unifying their GPU architectures for the next generation (like nVidia) to be able to subsidize development with sales of gaming and other cards (also like nVidia).

bjornsing · 3 months ago
> CUDA has been a huge moat

The CUDA moat is extremely exaggerated for deep learning, especially for inference. It’s simply not hard to do matrix multiplication and a few activation functions here and there.
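
For a sense of what that means in practice: at inference time a transformer feed-forward block is literally two matrix multiplies with an activation in between. A minimal NumPy sketch (generic shapes and the usual tanh GELU approximation, not any particular model's weights):

```python
import numpy as np

def gelu(x):
    # tanh approximation of GELU, common in transformer implementations
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def ffn(x, w1, b1, w2, b2):
    """x: (tokens, d_model); w1: (d_model, d_ff); w2: (d_ff, d_model)."""
    return gelu(x @ w1 + b1) @ w2 + b2

d_model, d_ff, tokens = 64, 256, 8
rng = np.random.default_rng(0)
x  = rng.standard_normal((tokens, d_model))
w1 = rng.standard_normal((d_model, d_ff)) * 0.02
w2 = rng.standard_normal((d_ff, d_model)) * 0.02
y = ffn(x, w1, np.zeros(d_ff), w2, np.zeros(d_model))
print(y.shape)  # (8, 64)
```

The hard part is making those matmuls fast at scale, but the operations themselves are not exotic.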

danesparza · 3 months ago
And it would be a big bet for AMD. They don't create and manufacture chips 'just in time' -- it takes man hours and MONEY to spin up a fab, not to mention marketing dollars.
belval · 3 months ago
> If CUDA isn't that strong of a moat/tie-in and Chinese tech companies can seemingly reasonably migrate to these chips, why hasn't AMD been able to compete more aggressively with nVidia on a US/global scale when they had a much longer head start?

It's all about investment. If you are a random company, you don't want to sink millions into figuring out how to use AMD, so you apply the tried-and-true "no one gets fired for buying Nvidia".

If you are an authoritarian state with some level of control over domestic companies, that calculus does not exist. You can just ban Nvidia chips and force companies to learn how to use the new thing. By using the new thing, an ecosystem gets built around it.

It's the beauty of centralized control in the face of free markets, and I don't doubt that it will pay off for them.

PunchyHamster · 3 months ago
I think they'd be entirely fine just using NVIDIA, and most of the push came from the US itself trying to ban export (or "export", as NVIDIA cards are put together in Chinese factories...).

Also, AMD really didn't invest enough in making their software experience as nice as NVIDIA's.

ithkuil · 3 months ago
Are there precedents where an authoritarian state outperformed the free market in technological innovation?

Or would china be different because it's a mix of market and centralized rule?

eunos · 3 months ago
Because the CUDA moat in China is being wrecked artificially for political reasons rather than technical ones.
nextworddev · 3 months ago
This is the right answer
buyucu · 3 months ago
I use AMD MI300s at work, and my experience is that for PyTorch at least there is no moat. The moat only exists in people's minds.

Until 2022 or so AMD was not really investing into their software stack. Once they did, they caught up with Nvidia.
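
As a minimal sketch of why that holds for everyday PyTorch work: ROCm builds of PyTorch expose AMD GPUs through the same torch.cuda API, so device-agnostic code like the following runs unchanged on an Nvidia card or an MI300 (assuming the matching PyTorch build is installed; the tiny model is just for illustration):

```python
import torch
import torch.nn as nn

# "cuda" here also covers ROCm-backed AMD GPUs in ROCm builds of PyTorch
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)).to(device)
x = torch.randn(32, 512, device=device)

with torch.no_grad():
    logits = model(x)

print(logits.shape, logits.device)
```

The vendor differences only start to matter once you drop below the framework, into custom kernels and cluster-level communication.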

imtringued · 3 months ago
The only way the average person can access an MI300 is through the AMD developer cloud trial, which gives you a mere 25 hours to test your software. Meanwhile Nvidia hands out entire GPUs for free to research labs.

If AMD really wanted to play in the same league as Nvidia, they should have built their own cloud service and offered a full-stack experience akin to Google with their TPUs; then they would be justified in ignoring the consumer market. But alas, most people run their software on their local hardware first.

chii · 3 months ago
AMD presumably doesn't have Chinese state backing, where profit is less of a concern and the work can be done unprofitably for many years (decades even), as long as the end outcome is dominance.
shrubble · 3 months ago
Sadly, AMD and its precursor graphics company, ATI, have had garbage driver software since literally the mid-1990s.

They have never had a focus on top notch software development.

Deleted Comment

baq · 3 months ago
CUDA isn't a moat... in China. The culture is much more NIH there.
sampton · 3 months ago
Because Chinese government can tell their companies to adopt Chinese tech and they will do it. Short term pain for long term gain.
2OEH8eoCRo0 · 3 months ago
It's interesting that CUDA is a moat, because if AI really were as good as they claim, then wouldn't the CUDA moat evaporate?
random3 · 3 months ago
Exactly. The whole argument that software is a moat is at best a temporary illusion. The supply chain is the moat, software is not.
dworks · 3 months ago
Most chipmakers in China are making or have made their new generation of products CUDA-compatible.
belter · 3 months ago
Do you know how bad AMD is at doing drivers and software in general?
FrustratedMonky · 3 months ago
People are trying to break the moat.

See Mojo, a new language designed to compile to other chips: https://www.modular.com/mojo

PunchyHamster · 3 months ago
I don't think "learn an entirely new language" is all that appealing vs "just buy NVIDIA cards".
buckle8017 · 3 months ago
CUDA is a legal moat.

A reimplementation would run into copyright issues.

No such problem in China.

cedws · 3 months ago
Apparently DeepSeek’s new model has been delayed due to issues with the Huawei chips they’re using. Maybe raw floating point performance of Chinese chips is competitive with NVIDIA, but clearly there’s still a lot of issues to iron out.
elp · 3 months ago
I'm sure there are LOTS of issues that need to be addressed, but demand for the chips is so high that the incentives are overwhelmingly in favor of this continuing. If the reported margins on the Nvidia chips are as high as the claims make them out to be (73+% ??), this will easily find a worldwide market.

It was also frustratingly predictable from the moment the US started trying to limit the sales of the chips. America has slowed the speed of Chinese AI development by a tiny number of years, if that, in return for losing total domination of the GPU market.

johndhi · 3 months ago
>America has slowed the speed of Chinese AI development by a tiny number of years, if that, in return for losing total domination of the GPU market.

I'm open to considering the argument that banning exports of a thing creates a market incentive for the people impacted by the ban to build a better and cheaper thing themselves, but I don't think it's as black and white as you say.

If the only ingredient needed to support massive innovation and cost cutting is banning exports, wouldn't we have tons of examples of that happening already - like in Russia or Korea or Cuba? Additionally, even if the sale of NVIDIA H100s weren't banned in China, doesn't China already have a massive incentive to throw resources behind creating competitive chips?

I actually don't really like export bans, generally, and certainly not long-term ones. But I think you (and many other people in the public) are overstating the direct connection between banning exports of a thing and the affected country generating a competing or better product quickly.

smokefoot · 3 months ago
I mean, I don’t know how long the NVIDIA moats can hold. With this much money at stake, others will challenge their dominance especially in a market as diverse and fragmented as advanced semiconductors.

That’s not to say I’m brave enough to short NVDA.

catigula · 3 months ago
Slowing AI development by even one month is essentially infinite slowness in terms of superintelligence development. It's a kill-shot, a massive policy success.

Lost months are lost exponentially and it becomes impossible to catch up. If this policy worked at all, let alone if it worked as you describe, this was a masterstroke of foreign policy.

This isn't merely my opinion; experts in this field feel superintelligence is at least possible, if not plausible. If it is, this is a massively successful policy, and, if it's not, little is lost. You've made a very strong case for it.

cshores · 3 months ago
The Chinese state operates the country much like a vast conglomerate, inheriting many of the drawbacks of a massive corporation. When top leadership imposes changes in tools or systems without giving engineers the opportunity to run proof-of-concept trials and integration tests to validate feasibility, problems like these inevitably arise.

Deleted Comment

maxglute · 3 months ago
Reported by one of the least credible PRC reporters on FT, who should be thoroughly ignored.
torginus · 3 months ago
There's a very important point made in the article - with recent export controls, domestic Chinese firms don't need to beat Nvidia's best, but only the cut-down chips cleared for Chinese export.
jarym · 3 months ago
The AI race is like the nuclear arms race. Countries like China will devote an inordinate amount of resources to be the best - it may take a year or two, but in the grand scheme of things that is nothing.

And NVIDIA will lose its dominance for the simple reason that the Chinese companies can serve the growing number of countries under US sanctions. I even suspect it won't be long before the US will try to sanction any allies that buy Chinese AI chips!

rhetocj23 · 3 months ago
China and Russia collectively have a talent pool dense enough to build the future products and services the rest of the world uses, if China can produce comparable hardware for AI.

Simple example being TikTok.

Its just a matter of time really.

WhereIsTheTruth · 3 months ago
> And NVIDIA will lose its dominance

They are vendor-locking industries. I don't think they'll lose their dominance; however, vendor-locked companies will lose their competitiveness.

MangoToupe · 3 months ago
Indeed. You could (and probably should) view the export restrictions as a subsidy for chinese manufacturing.
TSiege · 3 months ago
This is not true, and a lot of Nvidia's chips are smuggled into the country. There's a ton of domestic pressure to be the leading chip producer. It's part of China's strategic plan called Made in China 2025.
neworder56 · 3 months ago
Considering the fact that China controls most of the world's supply of rare minerals, considering the fact that the US is led by an incompetent leader, and considering the fact that Nvidia loses a big market, I think China can compete with even the leading Nvidia chips in a couple of years' time.

If that happens, China in turn can export those chips to countries that are in dire need of chips, like Russia. They can export to Africa, South America and the rest of Asia, thus resulting in more competition for Nvidia. I see bright times ahead, where the USA no longer controls all of the world's chip supply and OS systems.

I see this as an absolute win.

Citizen_Lame · 3 months ago
China doesn't control the supply of rare minerals but rather the production. Rare minerals are not really rare, but processing them is a "dirty" business and does a lot of damage to the environment.

China has managed to monopolise production (cheap prices) and advance the refinement process, so domestic projects elsewhere to extract rare earth minerals were not really profitable. Starting them again would take some time.

eagerpace · 3 months ago
Why do we look at these as a race? There is nothing to win. Nobody won space, or nukes, and they won’t win AI. You might get there first, but your competitor will get there soon after regardless. Embrace it.
TechSquidTV · 3 months ago
We win. The companies think they'll "win", and I'm fine letting them. The race is good for us.

Deleted Comment

xorcist · 3 months ago
There is no us and them!

But them, they do not think the same.

CamperBob2 · 3 months ago
Huh? Things would certainly have turned out very differently if Nazi Germany or Imperial Japan had won the nuke race.
MonkeyClub · 3 months ago
This conveniently coincides with China banning purchases of Nvidia AI chips:

https://news.ycombinator.com/item?id=45275070