Readit News
naet · 2 years ago
AI feels different to me because it's "retroactively" extractive.

Social media may be extractive if I give it my attention and data to resell. Crypto is extractive to the people buying into it and losing money. But I can avoid doing those things if I want.

AI these days takes anything I've written (from literature to code) or illustrated, and retroactively extracts value from it to be repackaged and redistributed. Even if I was proactive about securing the trademark or copyright, they can seemingly just ignore it without consequence, without me ever participating or opting in to it.

It feels massively different to have social media extract value from an image I uploaded willingly, vs AI extracting value (e.g. making things "in the style" of me) after training on my work against my will.

dantheman · 2 years ago
You mean like how you learned from others? Against their will?
Nevermark · 2 years ago
Scale has a quality all its own.

You create a highly useful website full of dense hard earned knowledge. Some people will just use that knowledge, but a few create their own sites, using what they learned.

A company sweeps up hard earned knowledge from millions of sites, with automation but not permission, and goes a long way to making those millions of informational sites redundant without any compensation.

Simplifying things obviously. But these two scenarios are not equivalent.

jj999 · 2 years ago
Why can't we agree there are differences between other human beings learning from other people and a multinational corporation learning from everyone at industrial scale, with industrial means?
southernplaces7 · 2 years ago
Truly, your reply is ridiculous in its comparison. How you could fail to see the difference between these two things, their scales and also their qualitative differences is impressive.
GaryNumanVevo · 2 years ago
There's a clear qualitative difference. If I post a lecture of mine on YouTube and a million people watch it, great! If a training company takes my lecture without my permission and re-sells it to a million people, that's clearly different.

(Based on a true story: it took me 3 years and 2 law firms to recover royalties for IP theft)

a_wild_dandan · 2 years ago
Yeah, I'm unsure why learning without author consent is judged differently between humans & AI. Some folk are justifying the distinction "because scale", as if that explains anything. Why would scale matter? And what's the threshold for scale that turns freely learning into theft? I'm not being facetious; I genuinely don't see the connection.
sebmellen · 2 years ago
There’s a shocking lack of introspection in the AI hype bubble — we are continually promised world-changing advances, but no one is very clear about which direction these advances are pushing us in. Even ‘AGI’ is nebulous and undefined at best.

Where is the focus on the foundational principles of growth? The author writes that endless growth is impossible, but that is only true in the absence of novel breakthroughs. We need creations and inventions which bring us the future we once imagined: limitless energy and abundance.

How have we completely lost focus on new physics or foundational sciences and devoted the smartest minds of our time to transformers and data scraping? There is some critical human component which is completely absent here.

Where is the renaissance, and how do we make it happen?

dotnet00 · 2 years ago
I think part of the issue is that a lot of the 'big' advances in physics nowadays aren't constrained by brain power as much as they are by cost (or rather, money allocation), and due to the corruption of the scientific funding system in most rich countries, waste is extremely high, severely limiting the money actually going into science. E.g., to make radical progress in physics, we need more sensitive telescopes, larger particle accelerators, more efficient space missions, more actual iteration on fusion reactors, etc.

All these things cost a lot of money, yet the majority of it gets eaten by systems designed to minimize the money actually going towards the research work. This isn't even really getting into how the researchers are paid.

I agree that AI researchers are concerningly blasé about what they're actually aiming to do. AGI is poorly defined, and the only negative externalities they seem to pay lip service to are culture-war stuff. There's near zero consideration for the real negative externalities they're perpetrating upon the world. I think this partly manifests in how they interact with fields where even older AI developments would be useful. At the physics lab I work at, I've found that many scientists have a negative impression of AI 'researchers' who don't care to understand the problem they're trying to solve, preferring to just treat everything - even things where the solution space can be constrained by scientific understanding - as a black box. It's almost the opposite of science in approach.

timeforcomputer · 2 years ago
I am studying physics now (with a CS and math background) and I feel obligated to get up to date on AI and to develop a good working philosophy of how it can be meaningfully used in scientific work. Not that all "apply machine learning to X" approaches aren't interesting, but I lack enough understanding to know whether these are popping up everywhere because people feel obligated to apply new methods. For example, the Fourier transform is deep and interesting and there are libraries and standards and ways to transform different objects over clusters, etc., but I wouldn't say good scientific research is about finding a place to apply a Fourier transform (maybe :)). I am new to this though.
sebmellen · 2 years ago
It’s still surprising to me that not one decabillionaire has funded a good basic sciences institute. With a few hundred million dollars (paid primarily to salaries) and an extremely rigorous selection process, one could establish a highly prestigious research university and bring the brightest minds together in one place. Just go from there!

If Leland Stanford could do it…

renegade-otter · 2 years ago
AI is just evolution of Big Data. It's not transformative, like the Internet was. The Internet was a new medium.

A more advanced Hadoop will not change your life. It will not change the way you read, listen, watch, communicate (fundamentally). The Internet did.

Especially at a time when the "legacy" internet is falling apart, a new way to find information is a necessity rather than some breakthrough.

The "good" AI needs to fight it out with the "bad" AI before we see if the net positive is even there. A podcast can be translated by AI into a different language, you say? The price is super-realistic fraudulent phone and video calls from your "family".

tomrod · 2 years ago
I'd argue this take is a bit dated, as it looks at ML/AI only in terms of training on data.

If you look at the enablement of features, AI is a new (and, currently, often flaky) medium.

It would be analogous to looking at the Internet as limited to making it simpler to order from a catalog over the phone. While technically true, there is more that it can do besides.

Much like we don't use copper in our walls for dialup anymore, I don't anticipate the initial architectures of LLM and MoE to last terribly long, but they do enable proving that the concept works.

kingraoul · 2 years ago
There’s another way to look at this - as the beginning of the construction of the Data Mesh: https://martinfowler.com/articles/data-mesh-principles.html

This can be seen as a “shift left” in data capabilities. The rise of Data Products ultimately democratizes access to data.

tazu · 2 years ago
> How have we completely lost focus on new physics or foundational sciences and devoted the smartest minds of our time to transformers and data scraping?

I'm not smart enough to be a physicist, but I like listening to Eric Weinstein[1]. He thinks string theory is essentially a dead-end honeypot doing exactly what you describe with our smartest minds.

[1]: https://youtu.be/eOvqJwgY8ow

mianos · 2 years ago
Sabine Hossenfelder just did a talk on this. I wonder if the smartest AIs will go down the same rabbit-hole and chew up a billion CPUs in a loop?
sebmellen · 2 years ago
If Weinstein could tone down his conspiratorial edges and focus on the primary substance of what he’s saying, I think he would be more effective at achieving his goals.

I like his podcast appearances — they are fun to listen to — but the solution to political machinations destroying established institutions is not to focus on the politics! We need to escape that frame entirely, and focus instead on building new institutions that are sufficiently reverent of smart minds and brilliant people.

Yes, we need new physics! We get there by escaping the current career trap which stops brilliant people from trying new approaches. Give the top, boldest, most daring researchers an alternative to tenure — $10m vested in a secure position at a new research institute. Then they won’t have to be scared of string theory boogeymen.

tim333 · 2 years ago
Tech innovation doesn't really happen through grand planning and "focus on the foundational principles of growth." It happens through lots of people trying to make stuff.

Many people think about the direction this is going, but tend to get dismissed as singularity cranks if they think far enough ahead.

protocolture · 2 years ago
People are working hard in all fields. Just because you see lots of AI news doesn't mean it's the only thing being done.

KuriousCat · 2 years ago
This is the consequence of giving up privacy and agency very easily. At some point universities stopped encouraging independent exploration and started minting out industry ready vocational humans eager to join the rat race. That needs to be fixed but at this point there are not many direct beneficiaries who would want to address that problem so this is going to get worse for a while.
vkou · 2 years ago
> At some point universities stopped encouraging independent exploration and started minting out industry ready vocational humans eager to join the rat race.

Universities absolutely encourage independent exploration.

Every year, they mint ~10x more grad students than there are research professor positions for them.

You want universities to produce even more of them?

anonzzzies · 2 years ago
After social media, anything goes; the rest are just extensions of the mind control social media made possible. Neither crypto nor AI would have been the hype they are without the current depressing iteration of social media and popular ‘influencers’ promoting their ‘businesses’ to somehow legally take your money. Even on HN, one of the only communities still out there with some sort of free thought, people are defending TikTok for being ‘great for discovery’, even though it’s likely a Chinese state weapon responsible for controlling every waking (and probably therefore sleeping) moment of a billion+ people. With seemingly smart people defending garbage like that, what does the rest matter? Who is going to listen to reason while scrolling through endless empty money grabs (which somehow they believe add actual value to their lives)?

There is an article on the HN homepage about leaded petrol lowering IQ; I bet that in 100 years TikTok etc will be considered far more detrimental than that.

xvector · 2 years ago
I've come across countless great artisans and small businesses I would have never discovered otherwise, all thanks to social media. I've spent tens of thousands of dollars on products from these small businesses.

I will go so far as to say that millions of small businesses exist because social media provides them with global reach. As an example, if you forge, say, artisan chef's knives made out of meteorite, no one in your town/village/etc likely cares, but there are thousands of people worldwide who will gladly buy from you.

TikTok/Reels is simply the next evolution in information condensation. It's entertaining but also educative. I might not care to spend 20 minutes learning about the intricacies of, say, forging Damascus steel, but I'll gladly watch a couple TikToks on it.

sebmellen · 2 years ago
The problem with condensed information is that it is extremely lossy. Sure, you may learn something with the bite-sized entertainment offered by a TikTok video or Instagram Reel, but the deeper substance is not there. It’s like saying that by eating 200 individual cheerios, or maybe 200 individual pieces of all kinds of cereal, you’ve eaten a full meal. But you haven’t, really.

Learning and understanding requires expending some kind of effort, and easy access to condensed information actually precludes you from experiencing that. If we don’t digest what we learn we are no better than LLMs — mindlessly regurgitating small bits of non-integrated information back and forth to each other.

timeforcomputer · 2 years ago
TikTok is another everything platform. If I used it, I imagine I could take any stream of thought and somehow warp it into a reason to use TikTok. It is very easy to pretend that the platform doesn't matter, that it is just a carrier providing the opportunity to engage in new thoughts or interactions. I made a huge mistake allowing reddit to be my go-to. I did not think twice about going from "I want to study insects" to reading the top 500 posts of all time for insect hobbyists, learning almost nothing except apparent insider knowledge and insect drama, feeling burned out, and continuing on from that thinly veiled meme-cynicism-comment-meme-article-comments new-tab new-tab new-tab cycle (in the context of insects) to the rest of the reddit machine. Maybe something is wrong with my brain, but I'm going to assume that TikTok is actively trying to draw every thought and intention into its algorithm, and leave someone scrolling TikTok rather than continuing with whatever it is they intended to do.
IMTDb · 2 years ago
First sentence of the article: "It's easy to pick on AI, because, well, it's costing a whole lot and providing, at best, dubious benefits."

"At best dubious benefits" ??? I feel like there is a whole essay missing there, because my analysis of the situation is that it provides at minimum minor benefits to million of people using it on a daily basis, and at best it saves lives to some of them.

That it not to say there is 0 harm anywhere; we definitely need to do some cost/benefit analysis. But the initial premise is so obviously flawed that it's hard to consider the rest of the arguments.

dbtc · 2 years ago
Can you give some examples?
data-ottawa · 2 years ago
LLMs/ChatGPT is saving me a lot of time, and it’s actually a very good tool to find a jumping off point for research.

Copilot saves me quite a bit of time writing documentation and boilerplate.

For hobbies AI is nice because I can ask my beginner questions and get results, where often you end up on low quality blogs or reddit threads arguing about things you shouldn’t bother thinking of when starting out.

protocolture · 2 years ago
I built a pretty complex home automation system using a stack of raspberry pis and python code that largely came out of GPT4 and copilot. I had no previous python experience. Just came up with the idea as a way to get GPT4 to teach me something.
plagiarist · 2 years ago
I use AI instead of a thesaurus, it's pretty adequate.
Eisenstein · 2 years ago
I ask GPT things all the time and it tells me how dangerous they are. Like, everything I ask, there is a warning about how I need to be careful doing it.

Maybe someone somewhere asked GPT how to cook pasta and had no idea boiling water was hot, and GPT's warning not to get boiling water on themselves saved them from massive burns?

protocolture · 2 years ago
This. I think the problem is that it's hard to say it's increased all performance by 50%. But every day I hear of a new use case that's turned a single person's life around completely.
joe_the_user · 2 years ago
I know a number of people who find ChatGpt convenient and even fun. But I don't know of "life changing" instances. What are these instances?

I mean, the phrase sounds like "I started my business with ChatGpt", which is true of all the recent trendy things and so would be less unique than people imagine or "ChatGpt cured my depression", where someone really shouldn't do that.

nonsensikal · 2 years ago
This article forgets its thesis because who cares? It's just disposable trash trying to pattern match on "AI bad." The whole point is just to nod along.
anonu · 2 years ago
> everything is lately.

Not lately - everything is extractive and has always been. This is the very foundation of economic development and creation. If we were against all extractive things then nobody would work for a living and our entire society would crumble.

MarcelOlsz · 2 years ago
Or maybe we'd just be dancing around fires fornicating and singing and hunting as intended. Wish "cities" were strictly medicinal hubs and the rest free to roam.
mianos · 2 years ago
The idea of abandoning cities to frolic in the wilderness is naive and disconnected from historical reality. For most of human history, life was defined by grueling labor, scarcity, disease and servitude for all but a privileged elite.

Modern development, for all its flaws, has brought immense progress in living standards and quality of life that shouldn't be romanticized away.

idle_zealot · 2 years ago
That sounds terrible. I'm glad most of our ancestors disagreed with you.
dotnet00 · 2 years ago
I see that the myth of the noble savage continues to live on.
ohyes · 2 years ago
This is a good insight, but it needs to scream the point louder. AI in the form of an LLM is a tool. We’re doing extractive things with all of these other tools as well. It is not AI that is the problem, we are the problem.
1vuio0pswjnm7 · 2 years ago
https://www.ipsos.com/sites/default/files/ct/news/documents/...

Only 37% of people surveyed in the US believe AI has more benefits than drawbacks

AI is curiously well-received in countries with high levels of corruption

grogenaut · 2 years ago
Is there anything in humanity or the universe that isn't extractive at some level? Machinery uses oil or metal. Hell, if you go all the way to solar or wood, you're still extracting energy from the sun, which is extracting energy from fusion.

Everything constructive is just refining entropy into something.