No they don't. It can also be that neither the government nor private parties give.
Making it an either/or often makes space for the individual to make excuses for why they don't share because out there somewhere there exists some government program that vaguely looks like charity.
Beware that you don't fall into the trap of thinking the 1% of the population that makes 90% of the noise on the internet is "significant" or a representative sampling of the population. Most everyone else's views are quite boring and detached from extremism; they just don't shout their moderation from the rooftops.
> If one exclusive group gets the benefit of developing AI with a 20% productivity boost compared to others, and they develop a 2.0 that grants them a 25% boost, then a 3.0 with a 30% boost, etc...
That’s a bit of a stretch; generative AI is least capable of helping with novel code, such as the code needed to make AGI.
If anything I’d expect companies working on generative AI to be at a significant disadvantage when trying to make AGI because they’re trying to leverage what they are already working on. That’s fine for incremental improvement, but companies rarely ride one wave of technology to the forefront of the next. Analog > digital photography, ICE > EV, coal mining > oil, etc.
Then it looks like Company A spends 90% of time on novel research work (while LLMs do all the busy work) and Company B spends 5% of time on novel research work.
You have to be careful with that word, peace, because all wars are defensive.
The intersection of the two seems to be quite hard to find.
At the stage we're at, the AIs we're building are just really useful input/output devices that respond to a stimulus (e.g., a "prompt"). No stimulus, no output.
This isn't a nuclear weapon. We're not going to accidentally create Skynet. The only thing it's going to go nuclear on is the market for jobs that are going to get automated in an economy that may not be ready for it.
If anything, the "danger" here is that AGI is going to be a printing press. A cotton gin. A horseless carriage -- all at the same time and then some, into a world that may not be ready for it economically.
Progress of technology should not be arbitrarily held back to protect automatable jobs, though. We need to adapt.
Does the current AI give productivity benefits to writing code? Probably. Do OpenAI engineers have exclusive access to more capable models that give them a greater productivity boost than others? Also probably.
If one exclusive group gets the benefit of developing AI with a 20% productivity boost compared to others, and they develop a 2.0 that grants them a 25% boost, then a 3.0 with a 30% boost, etc...
The question eventually becomes, "is AGI technically possible"; is there anything special about meat that cannot be reproduced on silicon? We will find AGI someday, and more than likely that discovery will be aided by the current technologies. It's the path here that matters, not the specific iteration of generative LLM tech we happen to be sitting on in May 2025.
> Instead of our current complex capped-profit structure—which made sense when it looked like there might be one dominant AGI effort but doesn’t in a world of many great AGI companies—we are moving to a normal capital structure where everyone has stock. This is not a sale, but a change of structure to something simpler.
― Sun Tzu
This concern is virtually unheard of today, and I wouldn't be surprised to learn that video games actually had a slight effect in the opposite direction: some of the youth who would be getting into trouble outside are now indoors playing harmless video games.