But that misses the fact that this is only the beginning. These models will soon generate entire worlds. They will eventually surpass human modellers and deliver stunning results in 1/100,000th of the time. From an idea, a photo, or a video. And the output will be easy to mold, like clay, with just a few words, a click, or a tap.
Blender's days are numbered.
I'm short on Blender, Houdini, Unreal Engine, Godot, and the like. That entire industry is going to be reinvented from scratch and look nothing like what exists today.
That said, companies like CSM, Tripo, and Meshy are probably not the right solutions. They feel like steam-powered horses.
Something like Genie, but not from Google.
This is a pretty sweeping and unqualified claim. Are you sure you’re not just trying to sell snake oil?
No.
robots.txt is designed to stop recursive fetching, not to stop AI companies from getting your content. Devising scenarios in which AI companies obtain your content without recursively fetching it proves nothing about robots.txt, because recursive fetching is the only thing it governs.
If you try to use robots.txt to stop AI companies from accessing your content, you will be disappointed, because it isn't designed to do that. You're using the wrong tool for the job.
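For what it's worth, the mechanism itself is trivial: a plain-text file of per-user-agent rules that cooperative crawlers consult before crawling. A sketch (GPTBot and CCBot are published crawler user-agents, but honoring the file is entirely voluntary):

    # robots.txt -- advisory only; it asks crawlers not to crawl,
    # it does not prevent anyone from fetching individual pages
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    # everyone else may crawl everything
    User-agent: *
    Disallow:

Anything that grabs a single URL directly, say a user pasting a link into a chatbot, never consults this file at all.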
// alias fetch under the verb it usually performs
const post = (url) => fetch(url, { method: "POST" });
Also, "fetch" is lousy naming considering most API calls are POST.
I would also use a YubiKey for banking, but I'm scared as f*** of what happens if I lose it while traveling abroad.