Hulk Hogan's Sex Tape Lawyer Fights For Westfield Teen Over AI Porn

WESTFIELD, NJ — A Westfield High School student who was “exploited, abused, and victimized” last year by fake pornographic images created with artificial intelligence has filed a lawsuit in federal court against a fellow student accused of creating the images.

The lawsuit was filed earlier this month by the student, identified as Jane Doe, and her parents in the United States District Court for the District of New Jersey against the fellow student, identified as K.G., and his parents, identified as R.G. and K.G.

Jane Doe, who is 15 years old, is represented by John Gulyas of Clark and Shane Vogt of Florida. Vogt is nationally known for representing American professional wrestler Hulk Hogan in a sex tape lawsuit against Tampa Bay area radio host Bubba the Love Sponge Clem.


“Out of respect for the Court and legal process and to ensure the anonymity of everyone involved, we cannot comment on any specifics about this case but hope it is successful and will demonstrate that there is something victims can do to protect themselves from the AI pornography epidemic,” Vogt told Patch. “All the girls and women being victimized by nonconsensual pornography created using artificial intelligence deserve to have someone willing to fight for them and their privacy, and we are extremely proud and humbled to have been given the responsibility to do that for our incredibly brave client in this case.”

The lawsuit alleges that in August 2023, K.G. took screenshots of or downloaded photos of Jane Doe and several other minor girls, who were clothed, from their social media accounts.


K.G. then allegedly used an AI website or app, believed to be “ClothesOff.io,” to remove the clothing from the girls’ original photos and generate nonconsensual nude images of Jane Doe and these other minor girls.

The lawsuit alleges K.G. then distributed and shared the nude photos, including an image or images of Jane Doe, with Jane Doe’s classmates and possibly others over the Internet via a Snapchat group.

A district spokesperson confirmed to Patch in November 2023 the incident happened over the summer and was brought to the attention of school officials in October. The district conducted an “immediate investigation,” according to the spokesperson, and notified Westfield police.

“Jane Doe never consented to Defendant’s use of her image to create nude images of her and never consented to Defendant receiving, possessing, saving, disclosing or disseminating photos depicting her nude,” according to the lawsuit.

Jane Doe’s parents said they were first made aware of the situation when they received a call on Oct. 20, 2023. The lawsuit also says K.G.’s father called Jane Doe’s parents to tell them what happened.

The incident prompted an investigation by the Westfield Police Department. On Jan. 24, 2024, the lawsuit alleges, police told Jane Doe and her parents “that charges could not be pursued at that time because the facts gathered by Jane Doe’s school could not be used to support the investigation and because Defendant and other potential witnesses failed to cooperate with, speak to, or provide access to their electronic devices to law enforcement.”

The lawsuit also alleges that Westfield Police “never determined the extent of the nude photos’ dissemination, never ensured that no further dissemination occurred, and never ensured the nude photos had been deleted and were no longer accessible.”

Westfield Police did not immediately respond to Patch’s request for comment.

The incident has since prompted elected officials to take action against AI-generated pornography.

In November 2023, state Sen. Jon Bramnick (District 21) announced he would sponsor a bill to join the fight against deepfake pornography. Read More: Bramnick Is Fighting AI Made Porn After Westfield Girls Victimized

In January 2024, Congressman Tom Kean, Jr. (NJ-07) and Congressman Joe Morelle (NY-25) hosted a joint press conference on taking action to end AI-generated deepfake pornography. They were joined by Dorota and Francesca Mani, who are working to pass HR 6466, the AI Labeling Act of 2023, and HR 3106, the Preventing Deepfakes of Intimate Images Act.

“Victims of child and nonconsensual pornography in which their actual faces appear, including Jane Doe, are not only harmed and violated by the creation of such images, but they are also haunted for the rest of their lives by knowing that they were and likely will continue to be exploited for the sexual gratification of others and that, absent court intervention, there is an everlasting threat that such images will be circulated in the future,” according to the lawsuit.

The lawsuit alleges Jane Doe suffered and will continue to suffer substantial reputational harm and the psychological harm of knowing that images depicting her nude were created, saved, exploited, disclosed, and disseminated amongst her classmates and possibly others, including for morbid and abhorrent sexual gratification.

“Victims of nonconsensual and child pornography such as Jane Doe are left to cope with the psychological impacts of knowing that images such as the Nude Photos almost inevitably make their way onto the Internet where they are retransmitted to others, such as pedophiles and traffickers, resulting in a sense of hopelessness and perpetual fear that at any time such images can reappear and be viewed by countless others, possibly even their friends, family members, future partners and employers, or the public at large,” according to the lawsuit.

The lawsuit seeks $150,000, attorney fees, litigation costs, and equitable relief, including a temporary restraining order, for invasion of privacy and intentional infliction of emotional distress.

See the full lawsuit below:

Jane Doe v. K.G. by Alexis Tarrazi on Scribd


Have a news tip? Email alexis.tarrazi@patch.com.

