zackmorris commented on Has the cost of building software dropped 90%?   martinalderson.com/posts/... · Posted by u/martinald
zackmorris · 7 days ago
*90% so far..

I've only been working with AI for a couple of months, but IMHO it's over. The Internet Age which ran 30 years from roughly 1995-2025 has ended and we've entered the AI Age (maybe the last age).

I know people with little programming experience who have already passed me in productivity, and I've been doing this since the 80s. And that trend is only going to accelerate and intensify.

The main point that people are having a hard time seeing, probably due to denial, is that once problem solving is solved at any level with AI, then it's solved at all levels. We're lost in the details of LLMs, NNs, etc, but not seeing the big picture. That if AI can work through a todo list, then it can write a todo list. It can check if a todo list is done. It can work recursively at any level of the problem solving hierarchy and in parallel. It can come up with new ideas creatively with stable diffusion. It can learn and it can teach. And most importantly, it can evolve.

Based on the context I have before me, I predict that at the end of 2026 (coinciding with the election) America and probably the world will enter a massive recession, likely bigger than the Housing Bubble popping. Definitely bigger than the Dot Bomb. Where too many bad decisions compounded for too many decades converge to throw away most of the quality of life gains that humanity has made since WWII, forcing us to start over. I'll just call it the Great Dumbpression.

If something like UBI is the eventual goal for humankind, or soft versions of that such as democratic socialism, it's on the other side of a bottleneck. One where 1000 billionaires and a few trillionaires effectively own the world, while everyone else scratches out a subsistence income under neofeudalism. One where as much food gets thrown away as what the world consumes, and a billion people go hungry. One where some people have more than they could use in countless lifetimes, including the option to cheat death, while everyone else faces their own mortality.

"AI was the answer to Earth's problems" could be the opening line of a novel. But I've heard this story too many times. In those stories, the next 10 years don't go as planned. Once we enter the Singularity and the rate of technological progress goes exponential, it becomes impossible to predict the future. Meaning that a lot of fringe and unthinkable timelines become highly likely. It's basically the Great Filter in the Drake equation and Fermi paradox.

This is a little hard for me to come to terms with after a lifetime of little or no progress in the areas of tech that I care about. I remember in the late 90s when people were talking about AI and couldn't find a use for it, so it had no funding. The best they could come up with was predicting the stock market, auditing, genetics, stuff like that. Who knew that AI would take off because of self-help, adult material and parody? But I guess we should have known. Every other form of information technology followed those trends.

Because of that lack of real labor-saving tech to help us get real work done, there's been an explosion of phantom tech that increases our burden through distraction and makes our work/life balance even less healthy through underemployment. This is why AI will inevitably be recruited to demand an increase in productivity from us for the same income, not to decrease our share of the workload.

What keeps me going is that I've always been wrong about the future. Maybe one of those timelines sees a great democratization of tech, where even the poorest people have access to free problem-solving tech that allows them to build assistants that increase their leverage enough to escape poverty without money. In effect making (late-stage) capitalism irrelevant.

If the rate of increasing equity is faster than the rate of increasing excess, then we have a small window of time to catch up before we enter a Long Now of suffering, where wealth inequality approaches an asymptote, making life performative: pageantry for the masses, who must please an emperor with no clothes.

In a recent interview with Mel Robbins in episode 715 of Real Time, Bill Maher said "my book would be called: It's Not Gonna Be That" about the future not being what we think it is. I can't find a video, but he describes it starting around the 19:00 mark:

https://podcasts.musixmatch.com/podcast/real-time-with-bill-...

Our best hope for the future is that we're wrong about it.

zackmorris commented on Z2 – Lithographically fabricated IC in a garage fab   sam.zeloof.xyz/second-ic/... · Posted by u/embedding-shape
zackmorris · 8 days ago
This is great!

I started programming on an 8 MHz Mac Plus in the late 1980s and got a bachelor's degree in computer engineering in the late 1990s. From my perspective, a kind of inverse Moore's Law happened, where single-threaded performance stays approximately constant as the number of transistors doubles every 18 months.

Wondering why that happened is a bit like asking how high the national debt would have to get before we tax rich people, or how many millions of people have to die in a holocaust before the world's economic superpowers stop it. In other words, it just did.

But I think that we've reached such an astounding number of transistors per chip (100 billion or more) that we finally have a chance to try alternative approaches that are competitive. So few transistors are in use per instruction that it wouldn't take much to beat status quo performance. Note that I'm talking about multicore desktop computing here, not GPUs (their SIMD performance actually has increased).

I had hoped that FPGAs would allow us to do this, but their evolution seems to have been halted by the powers that be. I also have some ideas for MIMD on SIMD, which is the only other way that I can see this happening. I think if the author can reach the CMOS compatibility they spoke of, and home lithography could be provided by an open source device the way that 3D printing happened, and if we could get above 1 million transistors running over 100 MHz, then we could play around with cores having the performance of a MIPS, PowerPC or Pentium.

In the meantime, it might be fun to prototype with AI and build a transputer at home with local memories. Looks like a $1 Raspberry Pi RP2040 (266 MIPS, 2 core, 32 bit, 264 kB on-chip RAM) could be a contender. It has about 5 times the MIPS of an early 32 bit PowerPC or Pentium processor.

For comparison, the early Intel i7-920 had 12,000 MIPS (at 64 bits), so the RP2040 is about 50 times slower (not too shabby for a $1 chip). But where the i7 had 731 million transistors, the RP2040 has only 134,000 (not a typo). So roughly 50 times the performance from over 5000 times the transistors means that the i7 delivers only about 1% of the per-transistor performance it should.

I'm picturing an array of at least 256 of these low-cost cores and designing an infinite-thread programming language that auto-parallelizes code without having to manually use intrinsics. Then we could really start exploring stuff like genetic algorithms, large agent simulations and even artificial life without having to manually transpile our code to whatever non-symmetric multiprocessing runtime we're forced to use currently.

zackmorris commented on Influential study on glyphosate safety retracted 25 years after publication   lemonde.fr/en/environment... · Posted by u/isolli
jeffwask · 10 days ago
Faking research data that then leads to the death of citizens from your product should result in a corporate death sentence.
zackmorris · 10 days ago
A mechanism for harm could be that glyphosate disrupts the gut lining barrier and flora, which can cause or contribute to leaky gut, a loose term for digestive waste and foreign bodies entering the bloodstream.

Those bodies can cause chronic inflammation and the strange autoimmune disorders we see rising over time. Note that some brands like Cheerios (which don't sell an organic equivalent) can contain 700-800 ppb of glyphosate, well over the 160 ppb limit recommended for children by the Environmental Working Group (EWG).

US wheat and other crops seem to have become harder to digest for some people due to genetic tampering. They contain substances borrowed from other species to reduce pest damage, which the body has little or no experience with, and which may trigger various reactions (this has not been studied enough to be proven yet).

All of these effects from gut toxicity could lead to ailments like obesity, malnourishment, cardiovascular disease, maybe even cancer. This is why I worry that GLP-1 agonists may be masking symptoms, rather than healing the underlying causes of metabolic syndrome that have been increasing over time.

Many people have chosen to buy organic non-GMO wheat from other countries for this reason. I believe this is partially why the Trump administration imposed a 107% tariff on Italian wheat for example, to protect US agribusiness.

Before you jump on me for this being a conspiracy theory, note that I got these answers from AI and so will you.

My personal, anecdotal experience with this was living with leaky gut symptoms for 5 years after a severe burnout in 2019 from (work) stress, which may have been triggered by food poisoning. I also had extremely high cortisol which disrupted everything else. So I got to the point where my meals were reduced to stuff like green bananas, trying everything I could to heal my gut but failing, until I finally snapped out of my denial and sought medical attention.

For anyone reading this: if holistic approaches don't fix it within, say, 6 weeks to 6 months, they aren't going to, and you may need medication for a time to get your body out of dysbiosis. But you can definitely recover and return to a normal life like I did, by the grace of God, the universe and everything.

zackmorris commented on Jony Ive's OpenAI Device Barred From Using 'io' Name   macrumors.com/2025/12/05/... · Posted by u/thm
arach · 10 days ago
is yo taken?
zackmorris commented on RCE Vulnerability in React and Next.js   github.com/vercel/next.js... · Posted by u/rayhaanj
baobun · 12 days ago
> I also had the same experience with the magic convention over configuration in Ruby.

I'm not sure what this is a reference to? Is it actually about Rails?

zackmorris · 11 days ago
Ya I used Rails on an aging project for about 6 months and there was so much magic behavior that we couldn't effectively trace through the code, so debugging even the simplest issue took days. Also the happy path mostly ran fine, but we couldn't answer even the simplest questions about the code or make estimations when something went wrong, because we couldn't isolate the source of truth in its convention-dominated codebase.

I come from a C++ background and mostly use PHP/Laravel today, and even though it does things less "efficiently" than the syntactic sugar in Ruby or low-level optimizations in .NET, I find that its lack of magic makes for much higher productivity in the long run. IMHO it feels like Ruby solves the easiest problems with sugar and then glosses over the hardest problems like they don't exist. So I just can't tell what problems it actually solves.

Generally, I think that cleverness was popular in the 2010s but has fallen out of fashion. A better pattern IMHO works more like Cordova or scripting in video games, where native plugins or a high-performance engine written in a language like Swift or Rust is driven by a scripting language like Javascript or Lua. Or better yet, driven declaratively by HTML or no-code media files that encode complex behavior like animations.

Of course all of this is going away with AI, and I anticipate atrociously poorly-written codebases that can't be managed by humans anymore. Like we might need pair programming just to take a crack at fixing something if the AI can't. I'm always wrong about this stuff though, so hopefully I'm wrong about this.

zackmorris commented on RCE Vulnerability in React and Next.js   github.com/vercel/next.js... · Posted by u/rayhaanj
odie5533 · 12 days ago
For a single page of HTML, ArrowJS's site loads really slow. I sat for almost a full second with just the header showing.
zackmorris · 11 days ago
Yikes I didn't know that! I haven't actually used it yet hah.

For a bit of context, I come from writing blitters on 8 MHz Mac Plusses, so I have a blind spot around slowness. Basically, that nothing should ever be slow today with GHz computers. So most slowness isn't a conceptual flaw, but an inefficient implementation.

These alternative frameworks are generally small enough that it might be kind of fun to stress test them and contribute some performance improvements. Especially with AI, I really have no excuse anymore.

Edit: after pondering this for 2 seconds, I suspect that it's actually a problem with backend requests. It may have some synchronous behavior (which I want) or layout dependency issues that force it to wait until all responses have arrived before rendering. That's a harder problem, but not insurmountable. Also things like this irk me, because browsers largely solved progressive layout in the 1990s and we seem to have lost that knowledge.

zackmorris commented on RCE Vulnerability in React and Next.js   github.com/vercel/next.js... · Posted by u/rayhaanj
halflife · 12 days ago
Why does the React development team keep investing its time in confusing features that only reinvent the wheel and cause more problems than they solve?

What do server components do so much better than SSR? What minute performance gain is achieved over client-side rendering?

Why won’t they invest more on solving the developer experience that took a nosedive when hooks were introduced? They finally added a compiler, but instead of going the svelte route of handling the entire state, it only adds memoization?

If I can send a direct message to the react team it would be to abandon all their current plans, and work on allowing users to write native JS control flows in their component logic.

sorry for the rant.

zackmorris · 12 days ago
I couldn't agree more. I'll probably switch from React to something like ArrowJS in my personal work:

https://www.arrow-js.com/docs/

It makes it easy to have a central JSON-like state object representing what's on the page, then have components watch that for changes and re-render. That avoids the opaqueness of Redux and promise chains, which can be difficult to examine and debug (unless we add browser extensions for that stuff, which feels like a code smell).

I've also heard good things about Astro, which can wrap components written in other frameworks (like React) so that a total rewrite can be avoided:

https://docs.astro.build/en/guides/imports/

I'm way outside my wheelhouse on this as a backend developer, so if anyone knows the actual names of the frameworks I'm trying to remember (hah), please let us know.

IMHO React creates far more problems than it solves:

  - Virtual DOM: just use Facebook's vast budget to fix the browser's DOM so it renders 1000 fps using the GPU, memoization, caching, etc and then add the HTML parsing cruft over that
  - Redux: doesn't actually solve state transfer between backend and frontend like, say, Firebase
  - JSX: do we really need this when Javascript has template literals now?
  - Routing: so much work to make permalinks when file-based URLs already worked fine 30 years ago and the browser was the V in MVC
  - Components: steep learning curve (but why?) and they didn't even bother to implement hooks for class components, instead putting that work onto users, and don't tell us that's hard when packages like react-universal-hooks and react-hookable-component do it
  - Endless browser console warnings about render changing state and other errata: just design a unidirectional data flow that detects infinite loops so that this scenario isn't possible
I'll just stop there. The more I learn about React, the less I like it. That's one of the primary ways that I know that there's no there there when learning new tools. I also had the same experience with the magic convention over configuration in Ruby.

What's really going on here, and what I would like to work on if I ever win the internet lottery (unlikely now with the arrival of AI since app sales will soon plummet along with website traffic) is a distributed logic flow. In other words, a framework where developers write a single thread of execution that doesn't care if it's running on backend or frontend, that handles all state synchronization, preferably favoring a deterministic fork/join runtime like Go over async behavior with promise chains. It would work a bit like a conflict-free replicated data type (CRDT) or software transactional memory (STM) but with full atomicity/consistency/isolation/durability (ACID) compliance. So we could finally get back to writing what looks like backend code in Node.js, PHP/Laravel, whatever, but have it run in the browser too so that users can lose their internet connection and merge conflicts "just work" when they go back online.

Somewhat ironically, I thought that was how Node.js worked before I learned it, where maybe we could wrap portions of the code to have @backend {} or @frontend {} annotations that told it where to run. I never dreamed that it would go through so much handwaving to even allow module imports in the browser!

But instead, it seems that framework maintainers that reached any level of success just pulled up the ladder behind them, doing little or nothing to advance the status quo. Never donating to groups working from first principles. Never rocking the boat by criticizing established norms. Just joining all of the other yes men to spread that gospel of "I've got mine" to the highest financial and political levels.

So much of this feels like having to send developers to the end of the earth to cater to the runtime that I question if it's even programming anymore. It would be like having people write the low-level RTF codewords in MS Word rather than just typing documents via WYSIWYG. We seem to have all lost our collective minds... the emperor has no clothes.

zackmorris commented on Inflatable Space Stations   worksinprogress.co/issue/... · Posted by u/bensouthwood
nkoren · 20 days ago
Airship To Orbit is JP Aerospace, not Bigelow. It seems like an utterly bonkers and fairly implausible concept and I'm definitely not equipped to analyze its merits. But the JP team have some legitimate accomplishments in the rockoon world, and appear to be honest, hardworking people. Definitely not grifters. I've been following their work on ATO since they first announced it at a Space Access conference in ... 2003, I think? Still can't figure out whether it's real or not.
zackmorris · 16 days ago
Oh hah, thanks, I don't usually make mistakes like that! I guess the two were wired together in my brain.

u/zackmorris

Karma: 6052 · Cake day: September 7, 2011
About
Everything always happens at once

https://www.linkedin.com/in/zack-morris-48996538/ http://stackoverflow.com/users/539149/zack-morris https://github.com/zmorris
