avalys · 2 years ago
Do grocery stores “capitalize on vulnerability” when they place name-brand products at eye level?

Do carmakers “capitalize on vulnerability” when they advertise pickup trucks as big tough vehicles for tough, outdoorsy men?

Do providers of health insurance for pets “capitalize on vulnerability” when they say you need to buy their product if you love your pet?

At some point people need to be responsible for their own decisions. And I can’t get that worked up about Meta’s free product.

quacked · 2 years ago
Yes, yes, and yes. However, the three situations you named are targeting adults. Children and teenagers are worth insulating from the full might of advertisers, propagandists, and always-on virtual social circles.
ff317 · 2 years ago
Arguably, adults are worth protecting as well. A lot of marketing and even design intent for adult products is also extremely misleading and intentionally designed to prey on our collective fallibility. In that sense, it's at the very least unethical, and maybe should be illegal. We have some laws about truth-in-advertising, but they're very weak in a lot of scenarios.
hf67 · 2 years ago
Even adults are fucked when info explodes, because there are upper limits to what a 3-inch chimp brain can do.

It's been studied under many different names: Knowledge Gap Theory, Information Asymmetry, Bounded Rationality, etc. The more info adults have to digest, the more bad decisions and exploitation are guaranteed.

AmericanChopper · 2 years ago
I would argue that it is the responsibility of parents to decide how to do that. But if you want a legislative solution, there already is one (COPPA), which effectively bars people under the age of 13 from joining social media by restricting data collection from them.

If you think those legal protections aren't fit for purpose (they were created long before social media even existed), then you should take that up with your legislators. I personally wouldn't trust them to approach that task without producing something horribly tyrannical, like a requirement for a full KYC process for creating social media accounts. So I'd advise that you be careful what you ask for in that respect.

alakin · 2 years ago
Are children and young adults banned from selecting products based on marketing inside stores?
bilbo0s · 2 years ago
Here is the issue with these kinds of things: where does it end?

Do tv shows target teens too much with high energy music and dancing?

Does media target them too closely with intentionally addictive music riffs, from Taylor Swift to Billie Eilish?

Will we shut down all these video games that clearly target kids with bright colors and, let's call it what it is, "aestheticized violence"?

We need to be careful about how we go about trying to protect children in this regard.

bdzr · 2 years ago
Ironically, many cereals are placed at eye level of children.
megaman821 · 2 years ago
Are you saying the cereal boxes on display at grocery stores are mostly targeting adults?
aaroninsf · 2 years ago
At some point,

we as a society are going to have to seriously engage with the fact that we are now fully capable of manufacturing addiction, and at the moment do so, in both adults and children.

"Their own decisions" is not a stable concept. Setting aside esoteric philosophy of mind, you need look no further than your own relationship to your phone—tested out for many of us at the Thanksgiving table last week, as duly noted by Chris Ware's cover of last week's New Yorker magazine—to confirm this.

The mechanisms of surveillance capitalism and a foundation of decades of consumer psychology (etc. ad nauseam) have quite literally left us adrift in a world of stochastic mind control. At that same table, many of us encountered the inexplicable worldviews of relatives whose propaganda bubbles did not intersect our own.

And we all have such bubbles, not least as a result of the cheerful professionalism of many who browse here.

Your decisions, just like teenagers' decisions, are not "your own" in the sense someone might have meant c. 1923. And before one cries "it has ever been thus": no, it absolutely has not. Today's technologies for behavioral steering are as unlike what people contended with in advertising (etc.) a hundred years ago as our logistics and energy industries are, among others.

Until we take this on, head on, as a society, the problem will just get worse.

BizarreByte · 2 years ago
> At some point people need to be responsible for their own decisions.

You're asking this of a group that largely can't vote or sign contracts, and that America doesn't trust to drink.

I can get worked up about Meta targeting children in ways they don't have the experience or knowledge to recognize, let alone avoid. Children should be protected from bad actors like Meta, and let me be clear: any company taking advantage of kids is a bad actor.

richardwhiuk · 2 years ago
Companies take advantage of kids all the time - there's a huge history of this - https://en.wikipedia.org/wiki/Advertising_to_children

The entire toy unboxing industry is built around advertising to children.

Deleted Comment

switchbak · 2 years ago
Do grocery stores “capitalize on vulnerability” when they ... place junk food at the checkout line to capitalize on your impulsiveness when you're least able to defend against it?

Yes.

You can be expected to make responsible decisions as an adult. That doesn't mean there aren't bad actors trying to take advantage of you, or that this behaviour isn't borderline unethical.

ff317 · 2 years ago
I should make responsible decisions as an adult, but it's still fair to call the grocery store's actions unethical, and to say that maybe they should be regulated if they can't be ethical on their own.
JumpCrisscross · 2 years ago
> At some point people need to be responsible for their own decisions

This is like saying “everything in moderation” in a discussion about nutrition. No shit. We’re trying to find that delineation.

Kapura · 2 years ago
>At some point people need to be responsible for their own decisions.

That point should come when they are no longer children. Targeting children to produce perfect little ecosystem consumers is... kinda evil.

Levitz · 2 years ago
>And I can’t get that worked up about Meta’s free product.

There is no "free" product. You are paying with freedom, you are paying with attention, you are paying with privacy. It's not "free"; it's extracting value from you.

It doesn't cost money, yes, but neither does working, yet we assume that transfer of value ought to be paid for. It's not Meta offering a "free" product to its users; it's the users offering one to Meta. They give Meta their data for "free", which Meta then uses for profit.

maximinus_thrax · 2 years ago
> At some point people need to be responsible for their own decisions.

We already do that. When they turn 18, we expect people to be responsible for their own decisions.

hotnfresh · 2 years ago
One normal human with 24 hours in a day, losing 45-ish hours a week to pull median income and another 8-ish per day to sleep, is up against multibillion-dollar companies hiring behavioral psychologists and marketing experts who collectively spend many thousands of hours per day finding ways to trick people. And their efforts demonstrably work.

The advertising industry’s a rabid dog the size of Godzilla and should be put down, whether it’s targeting kids or adults.

marcosdumay · 2 years ago
Not only will I add to the choir of people answering "yes" to all three, but if any of those actors operate in a way capable of redefining the reality people live in, they should be outlawed. Even if they are targeting adults.

Marketing gets a lot of freedom because of the assumption that they only take over a small part of the information a person has access to. To the extent that this assumption becomes incorrect, those actions become attacks.

AlexandrB · 2 years ago
Yes to all of the above? Advertising is a nasty industry and should be tightly controlled.
LeroyRaz · 2 years ago
There are a great number of regulations placed on products (and their advertising) to ensure customers are informed and protected.

Examples: Drinking disclaimers ("drink responsibly"). Cigarette disclaimers and off-putting mandated packet visuals. The traffic-light system in the UK (which displays a colour-coded breakdown warning of unhealthy food macros on the front of all ready meals). Alcoholic beverages being required by law to specify their alcohol percentage. Foods being required by law to specify their nutritional content and ingredients.

All of these regulations have been introduced to ensure customers are not blind to unhealthy choices (e.g., the traffic light system warning against high sugar content designed to make cheap addictive food). While not always effective, I believe that on balance these regulations make society a better place to live in. Similarly one could envision mandated social media disclaimers and warnings, and to regulate this way would be entirely within the wider norm, rather than something unusual.

skeaker · 2 years ago
Yes, yes, and yes. Advertising is almost always malicious. The days of it being primarily small businesses getting the word out about themselves are long over. Advertising firms now have actual psychologists that study ways to best exploit the brain of the common man, and that is wrong.
kibwen · 2 years ago
Hear hear. It's bizarre to see people like the parent commenter who are so unthinkingly accustomed to abusive and manipulative advertising that they mistake its perversity for normalcy.

Dead Comment

Curvature5868 · 2 years ago
The dilemma parents are grappling with is this: tablets and smartphones, while beneficial for children's learning and socializing, also expose them to constant marketing and propaganda, even within the confines of their bedrooms, as they attempt to connect with peers or complete tasks.

Previously, children's exposure to marketing and propaganda was mostly confined to their entertainment hours, during which they watched television or read magazines. There was at least some hope for moderation. However, "apps" have blurred these boundaries, as the same devices used for education and social interaction are also channels for persistent advertising and messaging, making it harder to limit exposure to just "entertainment" time.

lossolo · 2 years ago
> The dilemma parents are grappling with is this: tablets and smartphones, while beneficial for children's learning and socializing

Recent reports from teachers indicate that many children are intellectually behind their peers. A concerning trend is that these children struggle to hold conversations, a problem attributed to their parents' phone and social-network addictions. Rather than engaging and raising their children through conversation and interaction, these parents often resort to pacifying them with tablets or phones.

LargeTomato · 2 years ago
What do you mean they struggle to hold conversations? I was an awkward kid and you could say I struggled to hold conversations but it wasn't due to an addiction to tech. I'm also socially well adjusted now, as an adult.
LorenPechtel · 2 years ago
Yeah. I think this is the real problem. Electronics let parents slack off on parenting but electronics do not replace socialization.
LesZedCB · 2 years ago
> while beneficial for children's learning and socializing

citation needed? or are we just assuming because, well, there's education and social information and apps available on them?

gwbas1c · 2 years ago
> citation needed

This isn't Wikipedia. It's a casual internet forum, and you don't need someone to come armed with mountains of proof for casual (and obvious) statements.

BTW: One of my kids learned to read by playing a Cookie Monster word game during the pandemic. We've had enough "edutainment" software for a few decades that you don't need to ask for proof in a casual atmosphere.

raccoonDivider · 2 years ago
It's definitely harder to socialize when all your classmates have smartphones and you don't, but you mostly need access to personal messaging apps. TikTok or a Facebook/Twitter feed full of people you don't know IRL are where the problems come from and they aren't needed. If only there was a way of splitting those out into separate apps.

Basically we need more things like Facebook's push a few years ago to show more personal updates from close friends and less mass-shared political posts from organizations.

Curvature5868 · 2 years ago
I think it is hard to argue that you can't learn anything or socialize on your phone.
HDThoreaun · 2 years ago
Education is debatable, but I don't think there's any argument to be made that social life isn't degraded without a phone. High schoolers won't be invited to things if they don't have access to a smartphone.
splitwheel · 2 years ago
There is science driving the design of products to make them addictive.

For teen girls - the apps are designed to scare them about being socially excluded. For teen boys - the apps are designed to fill their need to master skills.

The issues the government has to deal with around app addiction are self-harm attempts by girls (e.g. emergency room visits) and underperformance of boys in the real world (e.g. low college enrollment).

If you are trying to make an addictive app, this is a good reference to understand the science: https://www.amazon.com/Hooked-How-Build-Habit-Forming-Produc...

BJ Fogg is a good reference too: https://www.bjfogg.com

EGreg · 2 years ago
There's a good article about how to fix it: https://www.laweekly.com/restoring-healthy-communities/

(Disclaimer: it talks about my work)

anthk · 2 years ago
>For teen girls - the apps are designed to scare them about being socially excluded.

Any female magazine ever.

wussboy · 2 years ago
Agreed, but I do think the effects of the addiction are radically different between a social media app and a magazine.

Deleted Comment

antiviral · 2 years ago
Yes, and moreover, the important point in the article that some people seem to be forgetting is that Meta itself believed that certain design choices led to addictive products, and worked to incorporate those designs despite harmful consequences to children and adults alike. It matters much less whether anyone on the outside believes this or not.

Additionally, saying that children and adults should be wholly responsible for this is like saying the Chinese and not the British should be responsible for their opium addiction (see Opium War) and that the homeless in San Francisco should be responsible for their fentanyl addiction. They can always just say no, right?

I worry that if nothing is done, this will only get worse, addiction will become the norm, of one sort or another, and you can just look at history of the Opium War to see where this leads.

jocaal · 2 years ago
> Additionally, saying that children and adults should be wholly responsible for this is like saying the Chinese and not the British should be responsible for their opium addiction (see Opium War) and that the homeless in San Francisco should be responsible for their fentanyl addiction. They can always just say no, right?

This is why I find it funny that FAANG people call themselves software engineers. In the real world, an engineer is wholly responsible for the projects they bring into the world. Imagine a bridge collapses and someone dies, and in court the family is told that the victim was responsible for researching bridge designs before using it. These social media companies are just run by money-hungry a-holes.

superkuh · 2 years ago
This is what happens when you start using the word "addiction" outside of contexts where it applies. You get these kinds of invalid and dangerous arguments, comparing actually addictive substances, which hijack incentive salience directly at the physiological level, to a screen and speakers, which most definitely do not.
superkuh · 2 years ago
There is a for-profit pseudo-science, much like the anti-gay camps of the 1980s, which is spreading unsupported claims using words like "addiction" in contexts where the medical regulatory bodies and journal literature don't believe the concept applies. These people prey on the irrational behavior of parents scared for their children and try to convince them that things like addiction to a website on a screen are possible. They write popular press books, go on talk shows, etc., to keep the meme (and their funding sources) alive. But the DSM and ICD just don't support it. Neither does the recent literature; at least if you stay out of the pay-to-publish third-tier "journals" these scammers submit their "science" to. And yes, it even applies to media personalities associated with Stanford.
erellsworth · 2 years ago
> These people prey on the irrational behavior of parents scared for their children and try to convince them that things like addiction to a website on a screen is possible.

Saying that addiction to a website isn't possible is unfounded.

People get addicted to online gambling. That's just "a website on a screen." It's clearly possible and it clearly happens.

dotandgtfo · 2 years ago
What are you stating? I genuinely don't get the point. Are you saying that screens/apps don't cause addiction?
verisimi · 2 years ago
My concern would be less about addiction and more that it is ingraining a sort of consumerist outlook, where corporate values are instilled into a child. That has always been the case of course, with education preparing the new generation for the workforce. But the use of technology disintermediates the parent from that process.
4death4 · 2 years ago
Doesn't BJ Fogg work at Stanford?
ericra · 2 years ago
Please correct me if I'm wrong, but it's my understanding that Meta (and most other big tech companies) has long been in the business of hiring large numbers of recent social-science Ph.D. graduates from top U.S. universities: people with a lot of statistical knowledge, plus domain-specific knowledge that could be applicable to the job. The whole purpose of doing this is to create teams of marketing people doing in-house research to figure out how best to manipulate others by maximizing "engagement" or whatever other metric.

Isn't this just how all big tech companies operate as a normal business practice? Certainly Youtube is no better when it comes to targeted content and advertisements to children to their detriment.

My main point is that I don't think it makes any difference whether Meta has some internal document proving that they specifically target children with these practices. The problem is so much bigger than a single policy or company, and legislatures need to figure out a better way to address the overarching problems. I don't have much faith that these one-off lawsuits will make that much of an impact given that they almost always lead to some fine or settlement that is an acceptable business loss for the company.

I'm all for Meta being decimated by a thousand cuts in the form of lawsuits from various levels of government, but at best it would just be replaced with something else unless more regulation exists at the top levels (US / EU / etc).

Deleted Comment

siliconc0w · 2 years ago
It seems like basically all marketing and advertising is human pen-testing. Thought is serialized into video or audio and then deserialized back into thought, which is evaluated. Sometimes this evaluation causes downstream thoughts and actions (including propagating the vulnerability). The question is whether the resulting action is "organic" or an RCE, one that overrides the agency of the actor.

I think a core class that should be taught is how to safely deserialize sensory input so as to avoid causing RCEs. Or basically "patching" these known vulnerabilities.
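The deserialization analogy above has a literal software counterpart, which may make the metaphor concrete: unpickling untrusted bytes in Python is a textbook RCE, because pickle will invoke arbitrary callables during load, while a data-only format like JSON can only reconstruct plain values, never code. A minimal sketch, where the `Malicious` class is a hypothetical illustration carrying a deliberately harmless payload:

```python
import json
import pickle

# UNSAFE: pickle executes code during deserialization. A malicious
# payload can define __reduce__ to make the loader invoke any
# callable, so unpickling untrusted bytes is effectively an RCE.
class Malicious:
    def __reduce__(self):
        # Harmless stand-in for an attacker's payload: pickle will
        # call str.upper("pwned") at load time.
        return (str.upper, ("pwned",))

payload = pickle.dumps(Malicious())
result = pickle.loads(payload)   # the "deserializer" runs the payload
print(result)                    # PWNED

# SAFER: json can only rebuild plain data, never executable objects,
# so the same attack surface simply doesn't exist.
data = json.loads('{"msg": "hello"}')
print(data["msg"])               # hello
```

In the comment's terms, switching from pickle to JSON is exactly the kind of "patch" that removes a known deserialization vulnerability: the input format is constrained so that evaluating it can't override the program's own agency.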

crowcroft · 2 years ago
The nuance with social media ("digital" media generally, I guess) is how hard it is for third parties to verify/audit/understand wtf is going on, to be able to prove whether anything negative is happening.

With broadcast media like TV, I can see what the programming is, and I can watch the same ads that every other house is getting broadcast to know what's being shown to kids (and research companies do this). Similarly for retail media, I can go to a store and see what a retailer is doing.

For Meta, with AI newsfeeds and targeted ads, it's impossible to know exactly what any one person's experience is. I don't know the veracity of this specific case, but at a minimum I think there should be some legislation that forces these companies to be auditable in some way...

NickC25 · 2 years ago
Should come as no surprise, honestly.

Above all else since going public, Meta is in the business of making money. It's not illegal to target users' vulnerabilities in order to get them to spend more time or money on the platform. It's unethical as hell, but it's business 101: the shareholders would revolt if Zuck came out and said "here's this opportunity to make you all a ton of money, but we're placing our personal ethics above doing this, so we're not". He'd get sued for breach of fiduciary duty.

Now, are Meta's product strategies unethical (or questionably ethical), harmful to society, and setting bad precedent? Yeah, I'd agree with that. But the market and shareholders like money.

BizarreByte · 2 years ago
> It's not illegal to target users' vulnerabilities in order to get them to spend more time or money on the platform.

Perhaps it should be illegal to target children in such ways? I'm tired of this argument that companies should be able to do whatever they wish in the name of profit; they need to be reined in with strong regulations.

liquidpele · 2 years ago
It’s also BS. Companies have branding and PR teams because they know it matters for profits. The issue with meta, twitter, etc is their real customers are advertisers. They can piss off users, and just pretend some % of bots are users. And let’s be real, few marketing companies are held accountable to their level of actual impact.
megmogandog · 2 years ago
I didn't read the comment you're replying to as endorsing Meta's actions, but rather stating that it is the only thing you should expect given the current lack of regulation.
erellsworth · 2 years ago
I don't really disagree with anything you said, but I would suggest that this kind of thinking is largely at the root of a lot of society's problems. Big corporations have become almost as powerful as governments in many respects. They are integral to our lives in a myriad of ways we probably don't even notice. Yet we don't just allow them to be completely amoral; we expect them to behave that way, almost demand that they do.

The result is that we have a lot of amoral institutions playing a key role in our society.

JumpCrisscross · 2 years ago
> shareholders would revolt if Zuck came out and said

Zuckerberg has super-voting shares that give him control over Facebook [1].

[1] https://www.reuters.com/breakingviews/zuckerberg-motivates-s...

GlibMonkeyDeath · 2 years ago
Just as putting actual cocaine in Coca-Cola at first also optimized shareholder value. And it was perfectly legal, and not even really considered unethical; heck, it boosted energy, so the user got a beneficial service!

Of course, we know how that worked out. What is galling is that Meta absolutely knows it is creating a bunch of cocaine addicted children.

megaman821 · 2 years ago
Or: Coca-Cola was made using the coca leaf and the kola nut for flavor and energy reasons. The coca leaf has trace amounts of cocaine in it, but it's not like they were mixing cocaine powder into their drinks. They still use the coca leaf today, but with the cocaine part removed.

Bringing up wrong historic points about "evil capitalists" doesn't really help your case against Meta.

AlexandrB · 2 years ago
Many of those shareholders are themselves Meta users or have kids who use Meta products. Crazy what kind of masochism "the system" encourages.
floatrock · 2 years ago
> "here's this opportunity to make you all a ton of money, but we're placing our personal ethics above doing this, so we're not". He'd get sued for breach of fiduciary duty.

A world with a social media company that's a B-corp would be a nice world.

blackhaz · 2 years ago
This is like saying - hey, let everyone buy their guns without background checks, because the shop has to make money!
fnimick · 2 years ago
If background checks weren't legally required, no store would be doing them. Every flagged check is a lost sale after all.