teekert · 3 months ago
Not just TikTok. I checked out Snapchat because my kid is the last one in his class not to have it (according to him). The first video I saw was two people falling off an e-bike, pretty painful; then someone making fun of someone with Down syndrome; then some weirdly squirming middle-aged women with duck faces; and then some very young ones (pretending to?) * off someone off-screen while staring into the camera.

Also, I denied all access, but it still suggested all my son's friends? How? Oh, and it won't even start without access to the camera.

I was pretty shocked. Still, a friend of mine, a teacher, tells me: you can't let your kid not have Snapchat, it's very important to them.

The Chinese apparently say: just regulate! TikTok in their country is fun, educational even, with safeguards against addiction, because they mandate it. Somehow we don't want that here? We see it as overreach? Well, I'm ready for some overreach (not ChatControl overreach, but you get what I mean). We leave it all up to the parents here, and all parents say: "Well, my kid can't be the only one not to have it."

Meanwhile, the kids I speak to tell me they regularly have vape shops popping up on Snapchat: some dudes sell vapes with candy flavors (outlawed here) until the cops show up.

Yeah, we also did stupid things, I know: we grew up, found porn books in the park (pretty gross in retrospect), drank alcohol as young as 15, etc. I still feel this is different. We're just handing it to them.

Edit: Idk if you ever tried Snapchat, but it is TikTok, chat, weird AI filters, and something called "stories", which for me features a barely dressed girl in a sauna.

mothballed · 3 months ago
>I was pretty shocked. Still, a friend of mine, a teacher, tells me: You can't let your kid not have Snapchat, it's very important to them.

Yeah, it's OK to say no.

If the kid wants a phone and Snapchat, there's nothing wrong with saying you simply won't be supplying that, and if they want it they'd best figure out how to mow lawns. If you're old enough to "need" a phone, you're old enough to hustle some yardwork and walk to the T-Mobile store yourself.

muwtyhg · 3 months ago
It's an unfortunate situation where they will be ostracized for lack of participation in social media like Snapchat or TikTok. Children ostracizing those who don't fit in has been a thing forever, but has been thrown into overdrive by ubiquitous social media usage by children.

I don't think making a kid work for the phone is the solution here. The problem is intentionally addictive algorithms being given to children, not a lack of work ethic regarding purchasing a phone.

NickC25 · 3 months ago
Give the kid a dumb phone.

It can make calls. It can send/receive basic text messages.

The most addictive thing on that type of phone, if it can even be installed, is Snake.

Utilitarianism wins. Social media companies lose. I'm fine with that. The kid can still communicate with their parents at a moment's notice.

rolandog · 3 months ago
> I still feel this is different. We're just handing it to them.

I think you are right to be worried, and I think you are correct that it is different:

IIRC, there were some Kremlin leaks some years ago indicating they knew how to "infect" a population with certain propaganda and have the disinformation live on or linger. Add to that Meta's/Facebook's (illegal?) study in which they experimented on people, trying to make them sad by showing them certain types of posts.

So, I think it stands to reason that controlling what you consume means being in control of what you think; in other words: we are what we watch.

We know there are some feedback loops occurring, but I think it is easy to get desensitized and become accustomed to very extreme content due to the pressure to fit in; perhaps, once one has participated, it is even harder to be deprogrammed (it requires facing the fact that one behaved wrongly towards others).

There's also the fact that being a good person takes a lot of willpower and dedication, is inconvenient, and is notoriously difficult to market as "fun".

It is more palatable for an impressionable kid to watch cheap, foreign-state-backed radicalizing propaganda than it is to learn about injustices being perpetrated on our behalf by the state apparatus.

We have developed the habit of being wary of what we consume in order to police our emotions (i.e. minding our mind so no desensitization happens on our watch).

We have seen what the "baddies" can do: the indifference to the suffering they cause, and the cruelty and pettiness they are capable of.

But I digress... I think you are right to be worried, but I am unsure about how to train kids not to fall into these pipelines.

teekert · 3 months ago
Yeah, I just opened it again and saw a girl of maybe 12 doing a sexy dance, lifting her shirt a bit. I reported it; they say it's no violation.

Sure, it's no violation, but I bet that app is pedo central, and many kids even share their location with anyone. Parents have no clue.

fragmede · 3 months ago
Or it's our state apparatus that doesn't want teenagers seeing the injustices they're perpetrating, and the "think of the children" argument is being pushed right now to hide videos of what's going on with ICE in Chicago and elsewhere.
AlexandrB · 3 months ago
Giving phones to kids was a very bad idea. I don't know how it became normalized.
jordanb · 3 months ago
"Safety" is how it was originally billed: your kids can call you if they get in trouble. They also created apps that let parents spy on where their kids were.
tboyd47 · 3 months ago
Like everything bad is normalized... step by step.
Fire-Dragon-DoL · 3 months ago
Using public transit with google maps is a night and day difference
NickC25 · 3 months ago
I have no issue giving kids a utilitarian dumb phone that can send and receive basic text messages as well as make calls.

Giving them smartphones? Moronic idea at best.

energy123 · 3 months ago
They turn the kid into the weapon; the parents don't hear the end of it until they cave and buy the thing.
ricw · 3 months ago
Instagram is the same. Had to install it for a dev project and it was disgusting. Social media just can’t be trusted.
Larrikin · 3 months ago
Why would kids just not immediately switch to something else? This reads like a parent saying video games should only be educational because of course the kid only cares about it being a video game and not the content.

It works in China because they have chat control to the extreme.

teekert · 3 months ago
"It works in China because they have chat control to the extreme."

Not really, they just tell the company to behave in this case. But yes they do have ChatControl to the extreme as well.

dentemple · 3 months ago
> After a “small number of clicks” the researchers encountered pornographic content ranging from women flashing to penetrative sex.

I (40m) don't think I've ever seen literal flashing or literal porn on TikTok, and my algorithm does like to throw in thirst content between my usual hobby stuff.

Are they claiming that showing porn is normal behavior for TikTok's algorithm overall, or that this is something specifically pervasive with child accounts?

mothballed · 3 months ago
Does TikTok direct what you see based on what the other accounts you interact with are interested in? I would expect teenagers to have a different interest profile than the average 40-year-old, and I would expect algorithms to more or less unwittingly direct you to the kind of stuff your peers are interested in.
netruk44 · 3 months ago
TikTok’s recommendations are based off as much info as it can get, really.

Approximate location, age, mobile OS/browser, your contacts, which TikTok links you open, who generated the links you open, TikTok search history, how long it takes you to swipe to the next video on the for you page, etc.

I don't think it's really possible to say what TikTok's algorithm does "naturally". There are so many influencing factors (beyond the promoted posts and ads which people pay TikTok to put in your face).

If you sign up to TikTok with an Android and tell it you’re 16, you’re gonna get recommended what the other 16 year olds with Androids in your nearby area (based on IP address) are watching.

gadders · 3 months ago
Yeah, I've not seen any actual porn either. Just thirst traps.

It might be because I always block anyone with an OF link in their bio, but then that policy doesn't work on Insta.

ivape · 3 months ago
You think thirst traps are okay for kids? If we rewind time, the Girls Gone Wild commercial is not supposed to be even remotely possible on certain channels.

We’re a derelict society that has become numb, “it’s just a thirst trap”.

We’re in the later innings of a hyper-sexualized society.

Why it’s bad:

1) You shift male puberty into overdrive

2) You continue warping young female concepts of lewdness and body image, effectively “undefining” it (lewdness? What is lewdness?).

3) You also continue warping male concepts of body image

mvieira38 · 3 months ago
This content isn't as overt as it may seem; maybe you did come across it and just didn't notice the flashing. Those "in the know", generally younger people whose friends told them about flashtok, know what to look for.
causal · 3 months ago
Also: kids click on links adults ignore without thinking. Our brains have built-in filters for avoiding content we don't want; for kids, everything is novel.
saurik · 3 months ago
I wonder when this study happened? FWIW, there was some pretty intense bombing of full-on nudity content to TikTok a month or two ago--it all looked like very automated bot accounts that were suddenly posting scenes with fully nude content cut out of movies--that I saw a number of people surprised were showing up in their feeds. It felt... weaponized? (And it did not last long at all, FWIW: TikTok figured it out. But it was intense and... confusing?)
fsckboy · 3 months ago
>Are they making the claim that showing porn is a normal behavior for TikTok's algorithm overall, or are they saying that this is something that specifically pervasive with child accounts?

The latter is what they tested, but they didn't say it was specifically pervasive.

You quote the article, so it seems like you looked at it, but the questions you are curious/skeptical about are things they talk about in the opening paragraphs. It's fine to be skeptical, but they explain their methodology, and it is different from the experience you are relying on:

>Global Witness set up fake accounts using a 13-year-old’s birth date and turned on the video app’s “restricted mode”, which limits exposure to “sexually suggestive” content.

>Researchers found TikTok suggested sexualised and explicit search terms to seven test accounts that were created on clean phones with no search history.

>The terms suggested under the "you may like" feature included "very very rude skimpy outfits" and "very rude babes" – and then escalated to terms such as "hardcore pawn [sic] clips". For three of the accounts the sexualised searches were suggested immediately.

>After a “small number of clicks” the researchers encountered pornographic content ranging from women flashing to penetrative sex. Global Witness said the content attempted to evade moderation, usually by showing the clip within an innocuous picture or video. For one account the process took two clicks after logging on: one click on the search bar and then one on the suggested search.

NoGravitas · 3 months ago
Yeah, I (50m) have never encountered literal porn on TikTok. Suggestive stuff, thirst traps, sex ed, sex jokes, yes, but no literal porn or even nudity.
Hizonner · 3 months ago
They are saying that they can find some of that content when they use relatively sophisticated techniques to intentionally try to find it.

They are in the business of whipping up outrage, and should not be given any oxygen.

lupusreal · 3 months ago
> relatively sophisticated techniques

Clicking on thirst trap videos?

thehodge · 3 months ago
Agreed, I've never even seen boobs on TikTok...
elevation · 3 months ago
A soccer mom I know shared that she once tried TikTok. Within seconds of installing the app, the algorithm was showing nsfw content. She uninstalled it.

I assume that the offending content was popular but hadn’t been flagged yet and that the algorithm was just measuring her interest in a trending theme; it seems like it would be bad for business to intentionally run off mainstream users like that.

yapyap · 3 months ago
After reading some of the article, it seems to me that they're saying that on a restricted account with the birthday of a 13-year-old, using the search terms TikTok suggests, a few clicks can get you to actual porn.
ChromaticPanic · 3 months ago
I'm on an unrestricted account and I can't find actual porn. Sounds like this article is rage bait, counting women in swimwear as porn.
IanCal · 3 months ago
Really? I've signed up to bluesky and tiktok and on both have seen literal porn extremely early without engaging directly (such as liking or responding, speed of scrolling could be something).
InitialLastName · 3 months ago
All of these apps are 100% using your scroll speed/how long you spend engaging with the content as a data point. After all, "time spent engaging with the content" is the revenue driver.

miyuru · 3 months ago
here is the original article from globalwitness with screenshots.

https://globalwitness.org/en/campaigns/digital-threats/tikto...

I don't know why news sites don't link to the source, but that's another discussion.

ChromaticPanic · 3 months ago
Looks like basic thirst trap content , probably not included to maximize their rage baiting. I don't understand why these puritans don't just move to the middle east. They instead ruin the internet for the rest of us.
0_____0 · 3 months ago
> 3. We have deliberately not included examples of the hardcore pornography that was shown to us.
throwaway2016a · 3 months ago
If you consider "skimpy outfits" pornographic, then both Facebook and X are worse than TikTok for me. I've seen a few pieces of content I had to report before, but not many.

X, on the other hand, has literal advertisements for adult products in my feed, and I get followed by "adult" bot accounts several times a week that, when I click through to block them, often show me literal porn. Same with spam Facebook friend requests.

I think it boils down to the simple fact that trying to police user-generated content is always going to be an uphill battle, and it doesn't necessarily reflect on the company itself.

> Global Witness claimed TikTok was in breach of the OSA, which requires tech companies to prevent children from encountering harmful content...

Ok, that is a noble goal, but I feel that the gap between "reasonable measures" and "prevent" is vast.

gjsman-1000 · 3 months ago
> I think it boils down to a simple fact that trying to police user-generated content is always going to be an up-hill battle and it doesn't necessarily reflect on the company itself.

I think it boils down to the simple fact that policing user-generated content is completely possible, it just requires identity verification, which is a very unpopular but completely effective idea. Almost like we rediscovered, for the internet, the same problems that need identity in other areas of life.

I think you will also see a push for it in the years ahead. Not necessarily because of some crazy new secret scheme, but because robots will be smart enough to beat most CAPTCHAs or other techniques, and AI will be too convincing, causing websites to be overrun. Reddit is already estimated to be somewhere between 20% and 40% robots. Reddit was also caught with their pants down by a study recently, with an AI bot on r/changemyview racking up ridiculous amounts of karma undetected.

throwaway2016a · 3 months ago
I'm not convinced that will fix the problem. Even in situations where identity is well known such as work or school, we commonly have bad actors.

It's also pretty unpopular for a good reason.

There is a chilling effect that would go along with it. Like it or not, a lot of people use these social platforms to be their true selves when they can't in their real life for safety reasons. Unfortunately for some people their "true self" is pretty trashy. But it's a slippery slope to put restrictions (like ID verification) on everyone just because of a few bad actors.

Granted I'm sure there's some way we could do that while maintaining moderate privacy but it's technologically challenging and I'm not alone in wanting tech companies to have less of my personal information not more.

yread · 3 months ago
Heh, on Facebook you don't even need any clicks. I logged in after a few years, and the first video among the Facebook shorts, or whatever it's called, was a woman removing her underwear.
random9749832 · 3 months ago
Breaking News: If you leave your child on TikTok unregulated you are an idiot.

The world is hostile and full of exploitation. It is no different on the internet.

throwacct · 3 months ago
Exactly. I don't understand why children "need" to use any social network. We were raised without it, and so can any kid, for that matter.
dns_snek · 3 months ago
You weren't ostracized for not engaging on social media because it didn't exist.

In 10-15 years Gen Z will be complaining about how their generation didn't need to have the most expensive AI boyfriend/girlfriend to avoid getting bullied, or something ridiculous like that.

duxup · 3 months ago
It's funny how this works when it comes to undesirable content:

The internet is full of "bad" (or at least undesirable under some circumstances) content. It's there on the internet, and we kind of accept its existence.

Then we train AI on it, and we're upset if it regurgitates it, so we have to add "safety".

Meanwhile social media sends you right to it ...

zakki · 3 months ago
On X, if you click a trending topic and scroll down, you'll see porn content quite soon, e.g. in the #Indonesia trending topic.