I get the purported intention of the article, which is helping women by sharing their stories of torment. On a second level, it smears OnlyFans and, I suspect, tries to shame OnlyFans into protecting "involuntary creators" better.
But: I can't help but feel that the focus on OF is misguided. There is only so much you can do when there's an abuser nearby who scares you so badly that you won't even talk to the police, who could show up with guns and put the abuser away for 20 years!
In fact, the article mentions that OF is so attractive because the revenue share for the creator (or abuser) is so high. It's not lax verification standards or anything like that.
I guess not many people would accept a technical solution like having to re-verify your account at a municipal office every 6 months, with a government ID issued to your real name, in order to upload porn...
OF could, if they don't have it already, add a button saying "I'm being forced to upload against my will" and then forward that information to local police. With financial transactions involved, there's always a real ID that can be traced.
> OF could, if they don't have it already, add a button saying "I'm being forced to upload against my will" and then forward that information to local police. With financial transactions involved, there's always a real ID that can be traced.
I think there is a common scenario where the victims don't upload anything themselves. The most prominent case in the article, the one where the guy had 14 or so guns, was probably such a case, even though that's not stated explicitly.
> I guess not many people would accept a technical solution like having to re-verify your account at a municipal office every 6 months, with a government ID issued to your real name, in order to upload porn...
Can you confirm your logic, please? Let me paraphrase: if a technology T has an undesired side effect X, and removing X would shrink T's market, we should accept X?
If that's the case, I can see a few cases where this applies. For example, we allow tobacco despite its known disastrous effects.
But honestly, I am appalled by the immediate jump to "meh, that's how capitalism works". Your argument amounts to "the only solution I could come up with in 30 seconds would devalue the platform, so them poor bitches will have to live through this!".
Here is a simple thing OF could do: aggressively inform users and creators about the "Signal for Help" [1].
Here is a more complex thing OF could do: screen uploads for suspicious content, just as they already do for child porn. They can use the exact same mechanisms, but look for typical signs of abuse in the content, in the payments, etc. That's all basic stuff OF apparently does not do today. If 5 women's accounts all pay out to the same billing information of a single man, that's a red flag for sure.
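The payment red flag above could be sketched in a few lines. This is a minimal illustration in Python, not anything OF actually runs; the record fields (`creator_id`, payout destination) and the threshold of 5 are hypothetical stand-ins for whatever a real billing schema would contain:

```python
from collections import defaultdict

def flag_shared_payouts(accounts, threshold=5):
    """Flag payout destinations shared by `threshold` or more distinct creators."""
    by_destination = defaultdict(set)
    for creator_id, destination in accounts:
        by_destination[destination].add(creator_id)
    # Keep only destinations that collect money from many creator accounts.
    return {dest: sorted(creators)
            for dest, creators in by_destination.items()
            if len(creators) >= threshold}

# Five creator accounts all paying out to one billing destination
# trips the flag; a lone account does not.
accounts = [(f"creator_{i}", "bank_acct_X") for i in range(5)]
accounts.append(("creator_9", "bank_acct_Y"))
flagged = flag_shared_payouts(accounts)
```

A real system would of course need fuzzier matching (the same person can use several bank accounts), but the basic grouping check is this simple.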
> But: I can't help but feel that the focus on OF is misguided. There is only so much you can do when there's an abuser nearby who scares you so badly that you won't even talk to the police, who could show up with guns and put the abuser away for 20 years!
Actually, I think that if you really want to, it is much easier to get hold of sex traffickers now that OF exists. Why? Because everything is documented: you may have channels to the victims, you have business information on the perpetrators, and so on.
Prostitutes have lots of contact with clients. Even if they work in a brothel, they meet government officials from time to time, ranging from the tax man to the health inspector.
Involuntary actresses in the porn industry are free to speak to anyone. And yet GirlsDoPorn was in business for years.
I'm saying: keep your expectations realistic. My comment was also a rebuttal of a sibling comment where someone claimed all problems would be solved if creators had to verify their accounts. The 99% of users who are legitimate might object to that. (Not the platform OF itself!)
What do you think will happen to a forced creator when the abuser screens the content for help signals before uploading? What if they don't know about it because they never interact with the platform directly?
At the end of the day, abuse is not a technical problem.
I didn’t flag it but I can’t help but notice how the stories are of awful abusers and their victims while somehow the article puts the blame for their suffering on OnlyFans.
The tone and the subtext seem to suggest that it is somehow OnlyFans who is causing women to be “deceived, drugged, terrorized and sexually enslaved” rather than the people who are perpetrating the actual deceiving, drugging, terrorizing, and sexual enslavement. (Why name the service OnlyFans but not the abuser who poured hot oil on their victim’s back?)
Just look at the other comments: any time someone mentions sex, somehow people get stupid. I think the flags come from at least two directions: those who are opposed to anything sexual, and those who are fed up with the cesspits these stories become on HN.
It’s a shame these things cannot be discussed here, but at the same time this is probably not the best demographic for this kind of topic.
Now that I've written it down, I will probably be proven wrong, but I don’t expect this story to remain on the front page for long.
The current OP is from a month or so ago, whereas the linked article in the HN post above is from yesterday. They're both worth reading and are in the same series of articles from Reuters, and neither should be flagged imo, though I think the latest article, about the origins of the site, is of more relevance to HN generally.
I presume you are right that the previous article was flagged, but today's article was also killed by flags. Unless there was a dupe, it was flagged and dead when I found it on hckrnews.com. I "vouched" for it to bring it back alive because I thought it was a solid article that was worthy of discussion.
As to why it was flagged, it's always hard to say. A combination of people who think it's a hit piece, those who think it's too accepting of porn in general, and those who think it's simply inappropriate for a work oriented tech site. And possibly some people who simply don't want to see evidence that this particular problem exists.
I vouched for it because I think it's good for porn consumers to contemplate whether their behavior is enabling abusers like the one described here. If you are going to watch porn, at least choose performers who are clearly happy to be doing what they are doing and thus less likely to be forced to do so against their will!
WarOnPrivacy: You should be able to read comments on a dead article if you turn on "showdead" in your profile (click your name at the top right of the page).
I'm speculating here, not asserting this as ground truth, but I get the impression that HN generally likes to avoid sexual/NSFW content, even if it is simply discussions involving those topics, and not simply gratuitous depictions.
I suspect a lot of folks use HN at work and just want to err on the side of caution for what they have on their work computers.
It's impossible to verify consent remotely. OnlyFans should be required to shut down their website until they can verify consent IN PERSON for all their creators. And if doing so is economically infeasible then they shouldn't be externalizing costs like systematic rape. </CaptainObvious>
Even with consent I don't see any positive coming out of it: for women it's unrealistic expectations about the money they'll get, in exchange for leaving an online trace for the rest of their lives, one they don't fully consent to and probably don't even know about in advance.
For men it's unrealistic expectations of getting a (short or long, but real) relationship with a woman who in reality looks down on them and doesn't even view them as real men.
It should be illegal as it's destroying society, but unfortunately it isn't.
That was my thought. They might even need to periodically re-verify in person. And ideally they should verify in person with a witness, like getting a driver's license or passport.
[1] https://en.wikipedia.org/wiki/Signal_for_Help
While I'm not eager to engage with the topic, I'm curious why it was flagged. Does anyone know?
> The tone and the subtext seem to suggest that it is somehow OnlyFans who is causing women to be “deceived, drugged, terrorized and sexually enslaved” rather than the people who are perpetrating the actual deceiving, drugging, terrorizing, and sexual enslavement. (Why name the service OnlyFans but not the abuser who poured hot oil on their victim’s back?)
This is false. The article does name the abuser.
https://news.ycombinator.com/item?id=42541269
PART 1 Behind the OnlyFans porn boom: allegations of rape, abuse and betrayal
PART 2 OnlyFans vows it’s a safe space. Predators are exploiting kids there.
PART 3 OnlyFans’ porn juggernaut fueled by a deception
PART 4 Sex bomb: The collateral damage of OnlyFans’ explosive success
PART 5 Enslaved on OnlyFans: Women describe lives of isolation, torture and sexual servitude
PART 6 Multiple OnlyFans accounts featured suspected child sex abuse, investigator reports
PART 7 How OnlyFans turned into a global empire bent on redefining porn
That’s too specific. It’s just a paywall system akin to MediaPass, MemberGate, and similar services.
> For a head of a famous company, he is little known.
> Even with specialized search tools, Reuters could find only six different photos of Radvinsky online.
Instantly obvious that he is not dumb.
Not as easy as recording a vid.