An abuse survivor can sue Visa over videos of her posted to Pornhub, a US court has ruled.
Serena Fleites was 13 in 2014 when, it is alleged, a boyfriend pressured her into making an explicit video which he posted to Pornhub.
Ms Fleites alleges that Visa, by processing revenue from ads, conspired with Pornhub’s parent firm MindGeek to make money from videos of her abuse.
Visa had sought to be removed from the case.
Ms Fleites’ story featured in the New York Times article The Children of Pornhub, which prompted MindGeek to delete millions of videos and make significant changes to its policies and practices.
Her allegations are summarised in the pre-trial ruling of the Central District Court of California.
Millions of views
The initial explicit video, posted to Pornhub without her knowledge or consent, had 400,000 views by the time she discovered it, Ms Fleites says.
She alleges that after becoming aware of the video, she contacted MindGeek pretending to be her mother “to inform it that the video qualified as child pornography”. A few weeks later, the video was removed.
But the video was downloaded by users and re-uploaded several times, with one of the re-uploads viewed 2.7 million times, she argues.
MindGeek earned advertisement revenue from these re-uploads, it is alleged.
Ms Fleites says her life “spiralled out of control”: there were several failed suicide attempts, and family relationships deteriorated. Then, while she was living at a friend’s house, an older man introduced her to heroin.
To fund her addiction, while still a child, she created further explicit videos at this man’s behest, some of which were uploaded to Pornhub.
“While MindGeek profited from the child porn featuring Plaintiff, Plaintiff was intermittently homeless or living in her car, addicted to heroin, depressed and suicidal, and without the support of her family,” Judge Cormac J. Carney’s summary of her allegations says.
MindGeek told the BBC that at this point in the case, the court has not yet ruled on the truth of the allegations, and is required to assume all of the plaintiff’s allegations are true and accurate.
“When the court can actually consider the facts, we are confident the plaintiff’s claims will be dismissed for lack of merit,” the company said.
‘The tools to complete the crime’
The judge ruled that, at the current stage of proceedings, “the Court can infer a strong possibility that Visa’s network was involved in at least some advertisement transactions relating directly to Plaintiff’s videos”.
But Visa argued that the “allegation that Visa recognized MindGeek as an authorized merchant and processed payment to its websites does not suggest that Visa agreed to participate in sex trafficking of any kind”.
It also argued, according to the judge’s account of its position, that a commercial relationship alone does not establish a conspiracy.
But Judge Carney said that, again at this stage of proceedings, “the Court can comfortably infer that Visa intended to help MindGeek monetize child porn from the very fact that Visa continued to provide MindGeek the means to do so and knew MindGeek was indeed doing so.
“Put yet another way, Visa is not alleged to have simply created an incentive to commit a crime, it is alleged to have knowingly provided the tool used to complete a crime”.
A spokesperson for Visa told the BBC that it condemned sex trafficking, sexual exploitation and child sexual abuse material.
“This pre-trial ruling is disappointing and mischaracterizes Visa’s role and its policies and practices. Visa will not tolerate the use of our network for illegal activity. We continue to believe that Visa is an improper defendant in this case.”
Zero tolerance
Last month MindGeek’s chief executive officer and chief operating officer resigned.
The senior departures followed further negative press in an article in the magazine the New Yorker, examining among other things the company’s moderation policies.
MindGeek told the BBC that it has:
- zero tolerance for the posting of illegal content on its platforms
- banned uploads from anyone who has not submitted government-issued ID that passes third-party verification
- eliminated the ability to download free content
- integrated several technological platform and content moderation tools
- instituted digital fingerprinting of all videos found to be in violation of its Non-Consensual Content and CSAM Policies, to help protect against removed videos being reposted
- expanded its moderation workforce and processes
The company also said that any insinuation that it does not take the elimination of illegal material seriously is “categorically false”.
Source: bbc.co.uk