Readit News
reaperman · 2 years ago
The article makes no mention of how much VRAM they might have. Other articles report plans for up to 192GB of HBM3e and a power draw of 1,000 watts.
sp332 · 2 years ago
I know it’s only FP4, but 1.4 exaflops in one rack is still crazy.
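The per-rack figure checks out as a back-of-envelope calculation, assuming a GB200 NVL72 rack with 72 GPUs and NVIDIA's headline ~20 PFLOPS of FP4 per GPU (both figures are assumptions from launch coverage, not from this article):

```python
# Rough sanity check of the "1.4 exaflops per rack" claim.
# Assumed: 72 GPUs per NVL72 rack, ~20 PFLOPS FP4 each.
gpus_per_rack = 72
fp4_pflops_per_gpu = 20

rack_eflops = gpus_per_rack * fp4_pflops_per_gpu / 1000  # PFLOPS -> EFLOPS
print(rack_eflops)  # 1.44
```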
alecco · 2 years ago
With scaling and some tricks, FP4 inference can be very close to 16 bit.

And most software out there still doesn't even support FP8, lol
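The "scaling and some tricks" above usually means block-wise quantization: each small block of weights shares a higher-precision scale factor, so the 4-bit values only have to cover a normalized range. Here is a minimal toy sketch in NumPy; the symmetric 16-level grid is a simplification (real FP4 formats such as E2M1 use a non-uniform grid), and all names are illustrative:

```python
import numpy as np

def quantize_4bit_blockwise(x, block=32):
    """Toy block-wise 4-bit quantization: each block of `block` values
    shares one scale; values snap to a 16-entry grid (4 bits)."""
    grid = np.linspace(-1.0, 1.0, 16)            # simplified uniform grid
    x = x.reshape(-1, block)
    scales = np.abs(x).max(axis=1, keepdims=True) + 1e-12  # per-block scale
    normed = x / scales                           # now in [-1, 1]
    # Nearest grid point -> a 4-bit index per value.
    idx = np.abs(normed[..., None] - grid).argmin(axis=-1)
    return idx.astype(np.uint8), scales, grid

def dequantize(idx, scales, grid):
    return grid[idx] * scales

rng = np.random.default_rng(0)
w = rng.normal(size=128).astype(np.float32)
idx, scales, grid = quantize_4bit_blockwise(w)
w_hat = dequantize(idx, scales, grid).reshape(w.shape)
print("mean abs error:", np.abs(w - w_hat).mean())
```

The point is that the scales stay in higher precision, so per-block dynamic range is preserved even though each value is only 4 bits.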
cyanydeez · 2 years ago
Doesn't that suggest the training method is flawed?
epistasis · 2 years ago
> next-gen NVLink switch that lets 576 GPUs talk to each other, with 1.8 terabytes per second of bidirectional bandwidth.

Well that's a pretty big step up from the 18 for the H100...
klysm · 2 years ago
It's terrifying to think about how many watts of power are being spent on generating useless trash
vasco · 2 years ago
Most human consumption is useless trash by some metric.
epistasis · 2 years ago
This is a bit vague, are you talking about this new GPU/LLMs, social media, industrial society, or something else?
reaperman · 2 years ago
Fun fact: each human uses an average of 100W of power and generates mostly useless crap.
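The ~100W figure follows from metabolic energy alone, assuming a ~2000 kcal/day diet (an assumption; actual intake varies):

```python
# Rough check of the "100 W per human" figure from food energy.
kcal_per_day = 2000          # assumed daily intake
joules_per_day = kcal_per_day * 4184   # 1 kcal = 4184 J
watts = joules_per_day / 86400         # seconds in a day
print(round(watts, 1))  # 96.9
```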
visarga · 2 years ago
> each human uses an average 100W of power

Does that include heating, electricity and fuel spent for the benefit of the human? How about infrastructure? Externalities are huge.