An example I can think of is when Eric Lippert, then a developer on the C# compiler, responded to a question about a "gotcha" in the language: https://stackoverflow.com/a/8899347/10470363
Developer interaction like that is going to be completely lost.
Think of the cost savings!
Why don't they just keep their findings to themselves and build products on top of them?
Public companies can't do stuff just for the fun of it, right? So there must be some commercial reasoning behind it?
A different metric is a more relevant goalpost -- the number of synapses. If each of the brain's 125 trillion synapses can adjust its strength independently of the others, it loosely corresponds to a parameter in a neural network. So if we get 100-trillion-parameter networks training but still no human intelligence, we'll know the bottleneck is something else. Currently, training 1T-parameter networks seems feasible.
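As a rough sketch of that arithmetic (the figures are the estimates from the comment above, not precise measurements):

```python
# Back-of-envelope comparison of brain synapse count vs. trainable
# model parameter counts, using the rough numbers cited above.
SYNAPSES = 125e12            # ~125 trillion synapses in the human brain
FEASIBLE_PARAMS = 1e12       # ~1 trillion parameters trainable today

# Factor by which today's feasible networks fall short of the synapse count
gap = SYNAPSES / FEASIBLE_PARAMS
print(f"Need roughly {gap:.0f}x more parameters")  # roughly 125x
```

So on this crude equivalence, we're about two orders of magnitude away from synapse-count parity.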
Then they got extended to detect writing systems: https://i.stack.imgur.com/ed1Co.png