Correct me if I'm wrong though.
With those structured numbers, will the LLMs be 100% accurate on new prompts, or just better than chance (even significantly better than chance)?
Because this is the thing: it has to learn the structure and then form probabilities based on the data, but does that mean it's actually learning the underlying algorithm for addition, say, or is it just getting better probabilities because they narrow? If it can indeed learn underlying algorithms like this, that's super interesting. The reason this matters is that if it _can't_ learn them, you can never trust the answer unless you check it, but that's something of a side point.
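To make the distinction concrete: "learning the underlying algorithm" for addition would mean internalizing something like the schoolbook digit-wise carry procedure, which is exact on operands of any length, including ones never seen in training. A minimal illustrative sketch (this is the algorithm itself, not an LLM):

```python
def add_digitwise(a: str, b: str) -> str:
    """Schoolbook addition on decimal digit strings, with carry.

    A model that truly internalized this procedure would be exact on
    arbitrarily long, never-seen operands; one that merely sharpened
    its output probabilities would still make occasional errors.
    """
    # Pad both numbers to the same length.
    width = max(len(a), len(b))
    a, b = a.zfill(width), b.zfill(width)
    result = []
    carry = 0
    # Walk both numbers right to left, one digit at a time.
    for da, db in zip(reversed(a), reversed(b)):
        total = int(da) + int(db) + carry
        result.append(str(total % 10))
        carry = total // 10
    if carry:
        result.append(str(carry))
    return "".join(reversed(result))
```

Checking a model for exactness on held-out, longer-than-training operands is precisely the test that separates algorithmic generalization from "better probabilities": the carry procedure gives 100% accuracy by construction, while narrowed probabilities do not.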
1) When artists 'steal' from others, they generally build something 'new'. AI can't really create anything 'new'. Case in point: Umberto Eco's The Name Of The Rose steals its plot and outline from crime stories, steals the library idea from Borges, and steals the murderer's motive from a medieval manuscript Eco knew (I forget which one). Yet the outcome is something completely new. Same goes for hip-hop music: sampling is at its core, but the final music that comes out is nothing like what it samples; the samples are just one part of something new.
2) When artists steal from each other, it's generally poor artists ripping off other poor artists, so no money can flow. When a million- or billion-dollar company rips off poor artists to make more money via generative AI, it's a different story: money could flow, but it doesn't.
To your second point, most of the work I've seen generated from these models was done for free. It could be argued that these tools add to an artist's toolbox rather than take something away. I can see, for example, a single poor artist creating a computer game of the same quality that it takes a AAA game company to produce today. Is that good or bad?
Is it meaningfully harder than C++ in this regard?
<meta> is another question: it depends on string_view, vector, and possibly other parts of the standard library. Maybe it could be made leaner with more selective internal dependencies.