Readit News
plastic-enjoyer commented on We’re Not So Special: A new book challenges human exceptionalism   democracyjournal.org/maga... · Posted by u/nobet
joduplessis · 9 days ago
These sorts of (sorry to say, but) dumb articles (+ books) I don't expect to see on HN.
plastic-enjoyer · 9 days ago
You see this stuff every day here in the form of AI hype. The whole ideology behind the powers that drive AI is that humans are dumb and inefficient and need to be replaced by artificial intelligence as the next stage in human evolution.
plastic-enjoyer commented on Anna's Archive: An Update from the Team   annas-archive.org/blog/an... · Posted by u/jerheinze
lolive · 11 days ago
I choose the books I buy, from Anna's Archive. I choose the comics I buy from readComicsOnline. I choose the [european] graphic novels I buy from #WONTTELL.

And I am one of the best customers of these 3 physical shops, in my town.

So sure, I don't buy the latest trends based on ads. I investigate a lot to buy GREAT stuff. Sometimes the shopkeeper has headaches trying to find the obscure stuff I discovered online that NOBODY knows exists.

Am I an exception?

I don't know but those services are great to maintain a freedom of choice.

plastic-enjoyer · 10 days ago
No, I'm the same. A lot of stuff I read is hard-to-get philosophy or from obscure authors, so I first get them from Anna's Archive. Reading them on paper is much better so I try to find a physical copy later.
plastic-enjoyer commented on Brain cells learn faster than machine learning, research reveals   techxplore.com/news/2025-... · Posted by u/pseudolus
N_Lens · 11 days ago
We know that biological neural networks learn faster and orders of magnitude more efficiently than synthetic neural nets. The amount of data/stimulus required to teach a growing human language, motor, social, and a variety of other skills is tiny compared to the mass amounts of data required to train SOTA models today.

The question is are there techniques we can adopt from bio neural nets that can enhance the training speed and efficiency of synthetic neural nets?
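The gap in data efficiency can be put in rough numbers. These are illustrative orders of magnitude only, not sourced figures: children are commonly estimated to hear on the order of tens of millions of words while acquiring language, while frontier models are trained on roughly ten trillion tokens.

```python
# Rough, assumed orders of magnitude (illustrative, not sourced figures):
child_words = 5e7   # assumed: tens of millions of words heard in childhood
llm_tokens = 1e13   # assumed: ~10 trillion training tokens for a SOTA model

ratio = llm_tokens / child_words
print(f"an LLM sees roughly {ratio:,.0f}x more linguistic data")
```

Even if both figures are off by an order of magnitude, the efficiency gap remains enormous.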

plastic-enjoyer · 11 days ago
> The question is are there techniques we can adopt from bio neural nets that can enhance the training speed and efficiency of synthetic neural nets?

The results seem to indicate that the limiting factor is the hardware current-state AI runs on rather than the algorithms.

plastic-enjoyer commented on The U.S. grid is so weak, the AI race may be over   fortune.com/2025/08/14/da... · Posted by u/plastic-enjoyer
goda90 · 14 days ago
The much simpler explanation is that our leaders are focused solely on short term gains. They'll grift their way to them gladly, but investing in infrastructure that'll take years to build and won't be useful until they are gone is not interesting to them.
plastic-enjoyer · 14 days ago
> They'll grift their way to them gladly, but investing in infrastructure that'll take years to build and won't be useful until they are gone is not interesting to them.

I think this may have something to do with the professionalisation of politics, or the existence of career politicians. If you want to climb the ladder in politics, working on short-term goals is probably the best way to do it. Infrastructure projects are high-risk, low-reward. They may take a long time, may be reversed/aborted by the next government, may piss off potential voters, may require fighting off NIMBYs, or aren't noticed due to the preparedness paradox.

plastic-enjoyer commented on The U.S. grid is so weak, the AI race may be over   fortune.com/2025/08/14/da... · Posted by u/plastic-enjoyer
el_jay · 14 days ago
While there is certainly an argument to be made that many contemporary “Western” Pseudo-Christian Superempire nations face a crisis of short-termism, there are also ancient bits of “Western” infrastructure like the Roman aqueducts still in use today - off the top of my head, the Aqua Virgo which supplies Rome’s Trevi Fountain, dated either 19BC or 19AD, I forget; Spain’s Segovia Aqueduct from the first century AD; and the Pont du Gard in Nîmes, from the same period.

Not quite as old as, or at the scale of, the Dujiangyan system, but still evidence that “Western” culture did once build for the long term. Less ancient, but more indicative, are the European cathedrals built by multiple generations over a century.

plastic-enjoyer · 14 days ago
Good point!
plastic-enjoyer commented on The U.S. grid is so weak, the AI race may be over   fortune.com/2025/08/14/da... · Posted by u/plastic-enjoyer
ZeroGravitas · 14 days ago
Even in this article, it repeatedly refers to building out infrastructure in advance of an obviously approaching future need as "oversupply".

This is almost a cliche in reporting on China that seems to reflect a serious blind spot in western media and/or business attitudes.

You can find plenty of articles complaining about "overcapacity" of battery factories in China even as they double in capacity and output each year.

Chinese electricity generation went from 4,000 TWh (the same as the US) in 2010 to double that in 2020. The US was basically the same after 10 years.

So a 100% "oversupply" in 2010 would be a zero percent oversupply within a decade given China's growth.
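That arithmetic can be checked with a short back-of-envelope sketch. The growth figure is derived from the doubling over 2010-2020 cited above, and `years_to_absorb` is a hypothetical helper, not anything from the article:

```python
import math

def years_to_absorb(oversupply_ratio: float, annual_growth: float) -> float:
    """Years until demand grows to meet supply built `oversupply_ratio` ahead."""
    return math.log(1 + oversupply_ratio) / math.log(1 + annual_growth)

# A doubling over a decade implies ~7.2% annual growth.
growth = 2 ** (1 / 10) - 1
print(f"annual growth: {growth:.1%}")
print(f"100% oversupply absorbed in {years_to_absorb(1.0, growth):.1f} years")
```

At that growth rate, a 100% oversupply is absorbed in exactly ten years; at the near-zero US growth rate, it would persist more or less indefinitely.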

Most telling to me is that decarbonisation and electrification of transport and heating has long been known to require a doubling(!) of electricity production for developed nations (and a similar increase in developing nations where it gets hidden by other growth).

Apparently the US simply never had a plan to achieve that, and amazingly it still isn't part of the conversation around AI power. Instead they're just claiming the best parts of the existing power systems and passing the costs onto local consumers.

plastic-enjoyer · 14 days ago
> Apparently the US simply never had a plan to achieve that, and amazingly it still isn't part of the conversation around AI power. Instead they're just claiming the best parts of the existing power systems and passing the costs onto local consumers.

I wonder if this is more of a cultural thing, meaning Western cultures being more aligned to short-term gains instead of long-term gains. I mean, look at the Dujiangyan irrigation system that was built 2,500 years ago and is still maintained today. This isn't something the Western world would even consider.

plastic-enjoyer commented on Steam games no longer purchasable with PayPal in most countries   tomshardware.com/video-ga... · Posted by u/Shank
plastic-enjoyer · 14 days ago
Wouldn't surprise me if Valve introduced their own payment processor in the near future
plastic-enjoyer commented on GPU-rich labs have won: What's left for the rest of us is distillation   inference.net/blog/what-s... · Posted by u/npmipg
ilaksh · 21 days ago
There is huge pressure to prove and scale radical alternative paradigms like memory-centric compute such as memristors, or SNNs, etc. That's why I am surprised we don't hear a lot about very large speculative investments in these directions to dramatically multiply AI compute efficiency.

But one has to imagine that seeing so many huge datacenters go up and not being able to do training runs etc. is motivating a lot of researchers to try things that are really different. At least I hope so.

It seems pretty short-sighted that the funding numbers for memristor startups (for example) are so low so far.

Anyway, assuming that within the next several years more radically different AI hardware and AI architecture paradigms pay off in efficiency gains, the current situation will change. Fully human level AI will be commoditized, and training will be well within the reach of small companies.

I think we should anticipate this given the strong level of need to increase efficiency dramatically, the number of existing research programs, the amount of investment in AI overall, and the history of computation that shows numerous dramatic paradigm shifts.

So anyway, I think "the rest of us" should be banding together and making much larger bets on proving and scaling radical new AI hardware paradigms.

plastic-enjoyer · 20 days ago
> There is huge pressure to prove and scale radical alternative paradigms like memory-centric compute such as memristors, or SNNs, etc. That's why I am surprised we don't hear a lot about very large speculative investments in these directions to dramatically multiply AI compute efficiency.

Because the alternatives lack the breakthroughs that would give them an edge over current-state AI, and they don't generate hype the way transformers or diffusion models do. You have stuff like neuromorphic hardware that is hardly accessible and in its infancy, e.g. SpiNNaker. You have disciplines like Computational Neuroscience that try to model the brain and come up with novel models and algorithms for learning, which, however, are computationally expensive or just perform worse than conventional deep learning models, and may benefit from neuromorphic hardware. But again, access to such hardware is difficult.
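For readers unfamiliar with SNNs: the basic unit is a spiking neuron rather than a continuous activation. Below is a minimal illustrative sketch of a single leaky integrate-and-fire (LIF) neuron, the textbook building block of such networks. This is not SpiNNaker's programming model, and all parameter values are arbitrary:

```python
def simulate_lif(current: float, steps: int = 100, dt: float = 1.0,
                 tau: float = 10.0, v_rest: float = 0.0,
                 v_thresh: float = 1.0) -> int:
    """Return the number of spikes emitted for a constant input current."""
    v = v_rest
    spikes = 0
    for _ in range(steps):
        # Membrane potential leaks toward rest while integrating the input.
        v += (dt / tau) * (-(v - v_rest) + current)
        if v >= v_thresh:   # crossing the threshold emits a spike...
            spikes += 1
            v = v_rest      # ...and resets the membrane potential
    return spikes

print(simulate_lif(0.5))   # sub-threshold input: v converges to 0.5, no spikes
print(simulate_lif(2.0))   # supra-threshold input: periodic spiking
```

Information is carried in the timing of discrete spikes rather than in dense floating-point activations, which is exactly why this maps poorly onto GPUs and wants dedicated hardware.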

plastic-enjoyer commented on 6 weeks of Claude Code   blog.puzzmo.com/posts/202... · Posted by u/mike1o1
detaro · 25 days ago
While that path exists, the vast majority of developers don't go through that path.
plastic-enjoyer · 25 days ago
Yes, but it is more of a cultural thing than anything else. Studying computer science to be a software developer* is like studying mechanical engineering to be a machine operator.

* except if you are developing complicated algorithms or do numeric stuff. However, I believe that the majority of developers will never be in such a situation.

u/plastic-enjoyer

Karma: 86 · Cake day: December 2, 2023