In Spain, dozens of girls are reporting AI-generated nude photos of them being circulated at school: ‘My heart skipped a beat’

bnew
Veteran

ARTIFICIAL INTELLIGENCE

In Spain, dozens of girls are reporting AI-generated nude photos of them being circulated at school: ‘My heart skipped a beat’

While the police investigate, the mothers of the affected girls have organized to take action and try to stop those responsible



MANUEL VIEJO

Madrid - SEP 18, 2023 - 18:44 CEST

Officers of the local police of Almendralejo. @POLICIALOCALALM


Back to school. First day of class. Isabel, 14, went to her high school last Tuesday in Almendralejo (Extremadura, Spain), a municipality of almost 30,000 residents where practically everyone knows each other. That morning, she entered the schoolyard to find a rumor spreading from group to group; it was all anyone was talking about: photos of naked female classmates were being passed around on everyone’s phones. Isabel (her name has been changed at her mother’s request) went out to recess with her friends. They were in shock. Suddenly, a boy approached her and said: “I saw a naked photo of you.”

The young girl was afraid. After school, she returned home, and the first thing she did was tell her mother. “Mom, they say there’s a naked photo of me going around. That they did it with an artificial intelligence app. I’m scared. Some girls have also received it.” Sara, her 44-year-old mother, immediately contacted the mother of her daughter’s best friend, who had also just told her family about the situation. After talking, the two started making calls; by then, more than 20 girls were affected. One mother then created a WhatsApp group to coordinate with everyone. By that Monday, the group already had 27 members.

Almendralejo has five middle schools and, in at least four of them, AI-generated images of naked students have been circulated. Police sources in Extremadura say they are aware of seven complaints so far. The case is being investigated by the Almendralejo judicial police, who have already identified “several” of the alleged authors of the photo montages, according to officials. The case has been placed in the hands of the Juvenile Prosecutor’s Office.

Sara filed her complaint last Friday. When she arrived at the police station, she ran into another mother who was just coming out the door. Fátima Gómez, 30, has a 12-year-old daughter. She found out about the case last Wednesday night around 10:00 p.m., when the mother of one of her daughter’s friends called to tell her: “I saw a naked photo of your daughter. It’s a montage.”

Gómez suffered an anxiety attack. Later, she had a conversation with her daughter: “Do you know anything about a naked photo?” The girl did not hesitate. She said yes, and showed her mother a recent Instagram conversation with a boy in which he asked her for “some money.” When she refused, the boy immediately sent her a naked photo of herself. All she could do was block the contact. The police believe there is a fake profile behind the account.

As the number of affected girls kept increasing, the group of mothers kept growing. One of them is Miriam Al Adib, a 46-year-old gynecologist with more than 120,000 followers on Instagram. There, last Sunday, she went live to talk about what had just happened at her home. The video already has more than 70,000 views. “I just got back from a trip; this is very serious, and I have to share it,” she says.

Al Adib, who has four daughters between 12 and 17 years old, tells EL PAÍS that she had just returned from a trip to Barcelona, where she gave some talks on female sexual health. After eating, her 14-year-old daughter approached her and said: “Mom, look what happened. They have done this to many girls.” Then the girl showed her the photo of herself naked. “My heart skipped a beat,” Al Adib says. “If I didn’t know my daughter’s body, I would think this photo was real.” After that, the girl told her that a friend’s mother was going to call, because apparently the mothers were organizing in a WhatsApp group.

The mother of the other girl told her on the phone that many girls were affected. “Some know that there are naked photos of their daughters, but they don’t have them,” she explained. Al Adib then told the others that she had a platform where she could post a video explaining the situation, to try to reach the kids sending these photos and make some noise. “This is a village, and we know — we know what’s going on.” The 10-minute video, in which she recounts what happened to her daughter, is accompanied by a text: “This, girls, won’t be tolerated. STOP THIS NOW. Girls, don’t be afraid to report such acts. Tell your mothers. Affected mothers, contact me, so that you can join the group that we created.”



The reaction was one of massive support for the affected mothers, with private and public messages urging them to keep going and to report. “All is not lost in society,” says Al Adib. “That women no longer remain silent is a fact. We are no longer ashamed. We are victims, and now we can speak because society supports us. That’s the message I have given my daughters, and they should never forget it.”

The investigation, according to police sources, remains open. The photo of one of the minors included a fly: the logo of Clothoff, the application presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”
 

Jaguar93
Superstar
:whoa: Them circles been doing this since Photoshop. As technology advances it’s going to get even crazier. Just look at deepfakes smh. And with countless women posting their pictures on social media, they’re just supplying content for them creeps to alter.
 

bnew
Veteran

For teen girls victimized by ‘deepfake’ nude photos, there are few, if any, pathways to recourse in most states

The FBI has warned that technology used to create pornographic deepfake photos and videos was improving and being used for harassment and sextortion.

Glitchy photo of a young woman’s face and neck. Leila Register / NBC News; Getty Images

Nov. 23, 2023, 8:03 AM EST

By Melissa Chan and Kat Tenbarge

Teenage girls in the U.S. who are increasingly being targeted or threatened with fake nude photos created with artificial intelligence or other tools have limited ways to seek accountability or recourse, as schools and state legislatures struggle to catch up to the new technologies, according to legislators, legal experts and one victim who is now advocating for a federal bill.

Since the 2023 school year began, cases involving teen girls victimized by fake nude photos, also known as deepfakes, have proliferated worldwide, including at high schools in New Jersey and Washington state.

Local police departments are investigating the incidents, lawmakers are racing to enact new measures that would enforce punishments against the photos’ creators, and affected families are pushing for answers and solutions.

Unrealistic deepfakes can be made with simple photo-editing tools that have existed for years. But two school districts told NBC News that they believe fake photos of teens that have affected their students were AI-generated.

AI technology is becoming more widely available, including Stable Diffusion (an open-source model that can produce images from text prompts) and “face-swap” tools that can put a victim’s face in place of a pornographic performer’s in a video or photo.

Apps that purport to “undress” clothed photos have also been identified as possible tools used in some cases and have been found available for free on app stores. These modern deepfakes can be more realistic-looking and harder to immediately identify as fake.

“I didn’t know how complex and scary AI technology is,” said Francesca Mani, 15, a sophomore at New Jersey’s Westfield High School, where more than 30 girls learned on Oct. 20 that they may have been depicted in explicit, AI-manipulated images.

“I was shocked because me and the other girls were betrayed by our classmates,” she said, “which means it could happen to anyone by anyone.”

Politicians and legal experts say there are few, if any, pathways to recourse for victims of AI-generated and deepfake pornography, which often attaches a victim’s face to a naked body.

The photos and videos can be surprisingly realistic, and according to Mary Anne Franks, a legal expert in nonconsensual sexually explicit media, the technology to make them has become more sophisticated and accessible.

A month after the incident at Westfield High School, Francesca and her mother, Dorota Mani, said they still do not know the identities or the number of people who created the images, how many were made, or if they still exist. It’s also unclear what punishment the school district doled out, if any.

The Town of Westfield directed comment to Westfield Public Schools, which declined to comment. Citing confidentiality, the school district previously told NBC New York that it “would not release any information about the students accused of creating the fake nude photos, or what discipline they are facing.”

Superintendent Raymond Gonzalez told the news outlet that the district would “continue to strengthen our efforts by educating our students and establishing clear guidelines to ensure that these new technologies are used responsibly in our schools and beyond.”

In an email obtained by NBC News, Mary Asfendis, the high school’s principal, told parents on Oct. 20 that it was investigating claims by students that some of their peers had used AI to create pornographic images from original photos.

At the time, school officials believed any created images had been deleted and were not being circulated, according to the memo.

“This is a very serious incident,” Asfendis wrote, as she urged parents to discuss their use of technology with their children. “New technologies have made it possible to falsify images and students need to know the impact and damage those actions can cause to others.”

While Francesca has not seen the image of herself or others, her mother said she was told by Westfield’s principal that four people identified Francesca as a victim. Francesca has filed a police report, but neither the Westfield Police Department nor the prosecutor’s office responded to requests for comment.

New Jersey State Sen. Jon Bramnick said law enforcement expressed concerns to him that the incident would only rise to a “cyber-type harassment claim, even though it really should reach the level of a more serious crime.”

“If you attach a nude body to a child’s face, that to me is child pornography,” he said.

The Republican lawmaker said state laws currently fall short of punishing the content creators, even though the damage inflicted by real or manipulated images can be the same.

“It victimizes them the same way people who deal in child pornography do. It’s not only offensive to the young person, it defames the person. And you never know what’s going to happen to that photograph,” he said. “You don’t know where that is once it’s transmitted, when it’s going to come back and haunt the young girl.”

A pending state bill in New Jersey, Bramnick said, would ban deepfake pornography and impose criminal and civil penalties for nonconsensual disclosure. Under the bill, a person convicted of the crime would face three to five years in jail and/or a $15,000 fine, he said.

If passed, New Jersey would join at least 10 other states that have enacted legislation targeting deepfakes, according to Franks, a law professor and the president of the Cyber Civil Rights Initiative, a nonprofit group that combats nonconsensual porn.

The state laws targeting deepfakes vary widely in scope. Some, like those in Texas and Wyoming, make nonconsensual pornographic deepfakes a criminal offense. Others, like New York’s, only allow victims to bring a civil suit.

Franks said the laws are “all over the place” and noncomprehensive, and their constitutionality has been called into question.

“So you’ve got a patchwork of criminal charges, which are going to be difficult in these cases because the perpetrators are going to be minors, so that raises its own questions,” she said.


‘Probably just the tip of the iceberg’

It’s unclear how many young people have been victimized by AI-generated nudes.

The FBI said it is difficult to calculate the number of minors who are sexually exploited, but the agency has seen a rise in the number of open cases involving crimes against children: more than 4,800 in 2022, up from more than 4,100 the year before, the FBI told NBC News.

“The FBI takes crimes against children seriously and works to investigate the facts of each allegation in a collective effort with our state, local, and tribal law enforcement partners,” the agency said, adding that victims can face significant challenges when trying to stop the spread of the image or get it removed from the internet.

Franks said there are likely a lot more incidents and that they will only increase.

“Whatever we’re hearing about that floats up to the surface is probably just the tip of the iceberg,” she said. “This is probably happening quite a bit right now, and girls just haven’t found out about it yet or discovered it or the school is covering it up.”

At Issaquah High School in Washington state, a school district representative said a mid-October incident “involving fake, AI-generated imagery of students” continues to affect the student body.

In the Spanish town of Almendralejo, mothers say dozens of their middle school-aged daughters have been victimized with AI-generated nude photos created with an app that can “undress” clothed photos. Local police in New Jersey, Washington and Spain are investigating the school cases.

In a June public service announcement, the FBI warned that technology used to create nonconsensual pornographic deepfake photos and videos was improving and being used for harassment and sextortion.

Meanwhile, the National Association of Attorneys General called on Congress in September to study AI’s effects on children and come up with legislation that would protect them from those abuses.

In a letter signed by 54 state and territory attorneys general, the group said it was concerned that “AI is creating a new frontier for abuse that makes prosecution more difficult.”

"We are engaged in a race against time to protect the children of our country from the dangers of AI,” the letter said.

Francesca and her mother said they plan to head to Washington, D.C., in December to personally urge Congress members to act, as they continue to advocate for updated policies within the school system and seek accountability for what happened.

“We all know this is not an isolated incident,” Dorota Mani said. “It will never be an isolated incident. This is going to keep happening all the time. We have to stop pretending that it’s not important.”

The rise in incidents targeting high school girls follows the proliferation of AI deepfake apps and deepfake porn websites where such material is created, shared and sold.

A 2019 report from Sensity, an Amsterdam-based company that tracks AI-generated media, found that 96% of deepfakes created at that point were sexually explicit and featured women who didn’t consent to their creation. Many victims are unaware the deepfakes exist.

Franks said there is nothing parents and children can do to prevent the creation of deepfakes using their likenesses. Instead, Franks said schools and local law enforcement need to make an example out of perpetrators in cases that reach the general public, to discourage others from creating deepfakes.

“If you could imagine a dramatic and important response from the school in New Jersey or from the authorities in New Jersey to make an example out of the case, really strict penalties, people go to jail, you might get the discouragement,” Franks said.

“In the absence of that, it’s just going to become one more tool that men and boys use against women and girls to exploit and humiliate them and that the law basically has nothing to say about.”
 

concise
Veteran
Social media companies need to support laws punishing this because they need people to keep using them ... otherwise ...
 