Ethical approach? Hell no. What do you expect from an unregulated capitalist system?
My point is that you wouldn’t expect any one of them to be so much better than the others at crawling that it would give them an advantage. It’s just overhead. They all have to do it, but it doesn’t put any of them ahead.
> For a website owner there is zero value in having their content indexed by AI bots. Zilch.
Earning money is not the only reason to have a website. Some people just want to distribute information.
Yes, I just want my hosting costs covered, and that is all. Otherwise you are paying for people to steal the info you "just want to share", info that others then make a profit on... that business model is absurd.
Thanks to the hard work of these organizations, the US market, at least, allows those who purchased these games to continue playing them with a third-party patch to their client. See below:
https://www.eff.org/deeplinks/2015/11/new-dmca-ss1201-exempt...
So we're hoping that a third-party patch can fix the issue. Hoping it does. Sounds like a pretty hopeless "legal" workaround.
I recently did, for the first time. I spent 15 minutes writing a long prompt to implement a ticket: a repeated pattern of code, 5 classes plus config per topic, all deeply interacting with each other, and it did the job perfectly.
It convinced me that the current code-monkey jobs, which are >90%, maybe >95%, of software engineering jobs, will disappear within 10 years.
We'll only need senior/staff/architect-level code reviewers and prompt engineers.
When the last generation that manually wrote code dies out, all people will do is prompting.
Just like assembler became a niche, just like C became a niche, high level languages will become a niche.
If you still don't believe it, you either haven't tried the advanced tools that can modify a whole project, are too incompetent to prompt properly, or indeed work in one of the rare, arcane, state-of-the-art frontier niches where AI can't help.
The problem with that is that if there are no juniors left...
Google-Fu is being replaced with Prompting-Fu
Not being allowed to, or choosing not to, spend time learning the limits, benefits, and drawbacks of different LLMs is basically handicapping yourself.
Crawling the web is not a competitive advantage for any of these AI companies, nor for challenger search engines. It's a cost and a massive distraction. They should collaborate on shared infrastructure.
Instead of all the different companies hitting sites independently, there should be a single crawler they all contribute to. They set up their filters and everybody whose filters match a URL contributes proportionately. They set up their transformations (e.g. HTML to Markdown; text to embeddings), and everybody who shares a transformation contributes proportionately.
This, in turn, would reduce the load on websites massively. Instead of everybody hitting the sites, just one crawler would. And instead of hoping that all the different crawlers obey robots.txt correctly, this can be enforced at a technical and contractual level. The clients just don’t get the blocked content delivered to them – and if they want to get it anyway, the cost of that is to implement and maintain their own crawler instead of using the shared resources of everybody else – something that is a lot more unattractive than just proxying through residential IPs.
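To make the idea concrete, here is a minimal Python sketch of how such a shared crawler could fan out one fetch to many subscribers. Everything in it is hypothetical: the client names, the substring filters, the toy page corpus, the naive HTML-to-Markdown placeholder, and the even cost split are illustrative assumptions, not a description of any existing system.

```python
from collections import defaultdict
from dataclasses import dataclass
from urllib import robotparser


@dataclass
class Client:
    """A company subscribing to the shared crawler (names are made up)."""
    name: str
    url_filters: list[str]      # URL substrings this client wants crawled
    transformations: set[str]   # shared pipeline outputs this client consumes


# Hypothetical subscribers.
CLIENTS = [
    Client("search-engine-a", ["/blog/", "/docs/"], {"html_to_markdown"}),
    Client("ai-lab-b", ["/blog/"], {"html_to_markdown"}),
]

# robots.txt is enforced once, centrally: blocked URLs are never delivered.
robots = robotparser.RobotFileParser()
robots.parse("User-agent: *\nDisallow: /private/".splitlines())

# Toy stand-in for the web; a real crawler would fetch each URL exactly once.
PAGES = {
    "https://example.com/blog/post": "<h1>Post</h1><p>Hello</p>",
    "https://example.com/docs/setup": "<h1>Setup</h1>",
    "https://example.com/private/page": "<h1>Secret</h1>",
}


def html_to_markdown(html: str) -> str:
    """Placeholder shared transformation; a real pipeline would use a proper converter."""
    for tag, repl in (("<h1>", "# "), ("</h1>", "\n"), ("<p>", ""), ("</p>", "\n")):
        html = html.replace(tag, repl)
    return html


costs = defaultdict(float)  # proportional cost attribution per client

for url, html in PAGES.items():
    if not robots.can_fetch("*", url):
        continue  # blocked content is simply never handed to any client
    interested = [c for c in CLIENTS if any(f in url for f in c.url_filters)]
    if not interested:
        continue  # nobody asked for it, so nobody crawls or pays
    for c in interested:
        costs[c.name] += 1.0 / len(interested)  # one fetch, cost split by demand
    markdown = html_to_markdown(html)  # computed once, reused by every consumer
    for c in interested:
        payload = markdown if "html_to_markdown" in c.transformations else html
        print(f"{c.name} <- {url}: {payload.strip()!r}")

print("cost shares:", dict(costs))
```

The two properties argued for above both fall out of the structure: each URL is fetched once no matter how many clients want it, and robots.txt is checked in one place before anything is delivered, so individual clients never have to be trusted to honor it themselves.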
And if you want to add payments on, sure, I guess. But I don’t think that’s going to get many people paid at all. Who is going to set up automated payments for content that hasn’t been seen yet? You’ll just be paying for loads of junk pages generated automatically.
There’s a solution here that makes it easier and cheaper to crawl for the AI companies and search engines, while reducing load on the websites and making blocking more effective. But instead, Cloudflare just went “nah, just pay up”. It’s pretty unimaginative and not the least bit compelling.
?? It's their ability to provide more up-to-date information and ingest specific sources, so having up-to-date information is definitely a competitive advantage.
Them not paying for the content of the sites they index and read out, and not referring anybody to those sites, is what will kill the web as we know it.
For a website owner there is zero value in having their content indexed by AI bots. Zilch.
Built-in camera apps are already superior...
Competition, fortunately
So there's no competition when there are no rules and regulations...? Interesting.
All those sports without rules or regulations, like American football, where anything goes.