ndai commented on Write in C – Let it Be   wiki.tcl-lang.org/page/Wr... · Posted by u/axiomdata316
ndai · 2 months ago
No pip freeze to lock the doom, No tangled trees in darkened bloom, No maintainers tricked by phishing spree — In C I hold the memory key.

Through buffer, pointer, syscall roar, I own the land, I own the shore; Let Python’s spiders weave their scheme, I’ll keep my ship rock-steady in C-stream.

ndai commented on Alibaba's new AI chip: Key specifications comparable to H20   news.futunn.com/en/post/6... · Posted by u/dworks
hollerith · 3 months ago
>design is less of a challenge than manufacturing.

If so, can you explain why Nvidia's market cap is much higher than TSMC's? (4.15 trillion versus 1.10 trillion)

ndai · 3 months ago
You could be right. But it could also be due to things like: automatic 401(k) contributions flowing into the market, easy retail investing, and general speculative attitudes.

ndai commented on Alibaba's new AI chip: Key specifications comparable to H20   news.futunn.com/en/post/6... · Posted by u/dworks
rich_sasha · 3 months ago
Can someone ELI5 this to me? Nvidia has the market cap of a medium-sized country precisely because apparently (?) no one else can make chips like them. Great tech, hard to manufacture, etc - Intel and AMD are nowhere to be seen. And I can imagine it's very tricky business!

China, admittedly full of smart and hard-working people, then just wakes up one day and in a few years covers the entire gap, to within some small error?

How is this consistent? Either:

- The Chinese GPUs are not that good after all

- Nvidia doesn't have any magical secret sauce, and China could easily catch up

- Nvidia IP is real, but Chinese people are so smart they can overcome decades of R&D advantage in just a few years

- It's all stolen IP

To be clear, my default guess isn't that it is stolen IP; rather, I can't make sense of it. NVDA is valued near infinity, then China just turns around and produces their flagship product without too much sweat?

ndai · 3 months ago
Isn’t NVIDIA fabless? I imagine (I jump to conclusions) that design is less of a challenge than manufacturing. EUV lithography is incredibly difficult, almost implausible. Perhaps one day a clever scientist will come up with a new, seemingly implausible, yet less difficult way, using “fractal chemical” doping techniques.

ndai commented on RustGPT: A pure-Rust transformer LLM built from scratch   github.com/tekaratzas/Rus... · Posted by u/amazonhut
ndai · 3 months ago
I’m curious where you got your training data? I will look myself, but saw this and thought I’d ask. I have a CPU-first, no-backprop architecture that works very well on classification datasets. It can do single‑example incremental updates which might be useful for continuous learning. I made a toy demo to train on tiny.txt and it can predict next characters, but I’ve never tried to make an LLM before. I think my architecture might work well as an on-device assistant or for on-premises needs, but I want to work with it more before I embarrass myself. Any open-source LLM training datasets you would recommend?
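The commenter doesn't describe how their no-backprop, single-example architecture works, so as a purely hypothetical illustration of the general idea (incremental updates from one example at a time, with no gradient backpropagation), here is a minimal perceptron-style sketch. The function name, toy data, and update rule are all assumptions for illustration, not the commenter's actual method.

```python
# Hypothetical sketch only: single-example incremental learning without
# backpropagation, using a classic perceptron mistake-driven update.
# This is NOT the commenter's architecture, just the simplest well-known
# instance of "update the model from one example at a time".

def perceptron_update(weights, bias, x, y, lr=1.0):
    """Update weights/bias in place from a single (x, y) example.

    y must be +1 or -1; the prediction is sign(w . x + b).
    Weights change only when the current prediction is wrong.
    """
    activation = sum(w * xi for w, xi in zip(weights, x)) + bias
    pred = 1 if activation >= 0 else -1
    if pred != y:  # mistake-driven: update only on errors
        for i, xi in enumerate(x):
            weights[i] += lr * y * xi
        bias += lr * y
    return weights, bias

# Stream toy AND-gate examples one at a time, as in continuous learning.
data = [([0, 0], -1), ([0, 1], -1), ([1, 0], -1), ([1, 1], 1)]
w, b = [0.0, 0.0], 0.0
for _ in range(10):  # a few passes over the stream
    for x, y in data:
        w, b = perceptron_update(w, b, x, y)
```

Because each update touches only one example, new data can be folded in at any time without retraining from scratch, which is the property the comment points to for on-device or on-premises use.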
