Readit News
dang · 2 years ago
Comments moved to https://news.ycombinator.com/item?id=35617763, which was posted earlier and has the original source that this article points to.
dahwolf · 2 years ago
Yesterday Stackoverflow, today Reddit. A clear pattern emerges where open web content/communities face existential issues if the current AI paradigm continues.

It's daylight robbery. The sum of 18 years of Reddit represents an enormous capital investment, as well as an immeasurable number of hours spent by its users to create the content.

It's absolutely baffling how a single entity (OpenAI, Google Bard) can just take it all without permission or compensation, and then centrally and exclusively monetize these stolen goods.

The fact that we barely even blink when this happens, and that founders confidently execute on an idea like this, tells you everything there is to know about our industry. It doesn't even pretend to do good anymore. Anything goes, really.

Anyway, get ready for an "open" web that will consist of ever more private places with ever higher walls. Understandably so: any and all incentive to do something on the open web is gone, since contributing is not only pointless now, it actively helps to feed a giant private brain.

charcircuit · 2 years ago
Google has been monetizing the comments the entire time. Reddit comments appear in Google search results alongside ads.
im3w1l · 2 years ago
Reddit users don't monetize their comments in the first place. So unlike, say, artists, there is really no issue of them being outcompeted by a derivative of their own material.

GPT-style models are by far the least problematic and the most clearly beneficial of the new innovations.

This is just a cash grab.

dahwolf · 2 years ago
"So unlike say artists there is really no issue of them being outcompeted by a derivative of their own material."

Well, except for the tiny detail that visiting or posting on Reddit then becomes rather pointless.

hristov · 2 years ago
Good luck. But on a related note, I think it is time that we all realize that the so-called wisdom of AI is really the wisdom of millions of Reddit posters, HN posters, GitHub software uploaders, and various posters on all kinds of forums all over the internet.
soneil · 2 years ago
I find that an interesting concept, because at the same time they're trying to combat bot-generated posts. So how long until we have GPT 'learning' from its own output?
ephbit · 2 years ago
> GPT 'learning' from its own output

Interesting implication.

A system such as GPT learning from its own output is almost obviously absurd. A computer model getting fed its own output as training data feels like some kind of short circuit, or maybe just a circuit with a capacitor and a resistor that slowly dissipates its energy until nothing happens or changes anymore.
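A toy sketch of that short-circuit intuition (purely illustrative; this is not how GPT is actually trained): let a one-dimensional Gaussian stand in for "the model", refit it to samples drawn from its own previous generation, and watch the spread shrink toward nothing.

    import random
    import statistics

    # Toy assumption: a 1-D Gaussian stands in for "the model"; each
    # generation is fit only to samples produced by the previous one.
    mu, sigma = 0.0, 1.0
    samples_per_generation = 10  # deliberately small so the effect shows quickly

    for generation in range(101):
        if generation % 10 == 0:
            print(f"gen {generation:3d}: mu={mu:+.3f}  sigma={sigma:.4f}")
        samples = [random.gauss(mu, sigma) for _ in range(samples_per_generation)]
        mu = statistics.fmean(samples)    # "retrain" on our own output
        sigma = statistics.stdev(samples)

    # sigma trends toward zero while mu drifts at random: the model's
    # diversity dissipates, much like the capacitor analogy above.

Run it a few times: sigma keeps shrinking across generations, while mu wanders off to wherever the early sampling noise happened to push it.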

Yet, isn't what humans do mostly the same, just more sophisticated?

Humans pick up experiences and references and information and, inspired by these, create "new" things. I'd argue that the "new" things humans create are really just new combinations/iterations of things that they picked up earlier in life.

The difference between GPT and humans is mostly that the human recombination system is vastly more complicated/sophisticated than its GPT counterpart and produces more chaotic output.

In the end, the _essential_ part of the (let's call it) cultural evolution happening with humanity maybe isn't so much the creation/recombination aspect (which produces this vast stream of new things, almost randomness), but rather the selection process. Lots of random things are being created all the time, and most are discarded, except for those that somehow appear valuable or get selected for other reasons.

The selection of things from the vast random stream might matter more than the process that creates the vast random stream.
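A hypothetical toy example of that last point: if the "creative" process is pure noise and a simple filter does the selecting, then what accumulates is shaped far more by the filter than by the generator. The function names here are made up for illustration.

    import random
    import string

    # Toy assumption: creation is pure noise; selection does all the work.
    def generate_candidate(length: int = 6) -> str:
        return "".join(random.choice(string.ascii_lowercase) for _ in range(length))

    def looks_valuable(candidate: str) -> bool:
        # Arbitrary selection criterion: roughly "pronounceable" strings.
        vowels = set("aeiou")
        return sum(ch in vowels for ch in candidate) >= 2 and candidate[0] not in vowels

    stream = (generate_candidate() for _ in range(100_000))
    kept = [c for c in stream if looks_valuable(c)][:10]
    print(kept)

The generator never changes; swap out looks_valuable() and the "culture" that accumulates changes completely.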