The Dark Side of AI: Deepfake Nudes and Manipulated Audio Fuel School Bullying!

  • Editor
  • June 10, 2024

In recent years, the rapid advancement of artificial intelligence has introduced a new and alarming dimension to school bullying. This phenomenon, characterized by the creation and dissemination of deepfake nudes and incriminating audio, has escalated the severity and complexity of cyberbullying.

The combination of AI technology with malicious intent has resulted in a surge of incidents that educational institutions, parents, and legal frameworks are struggling to address effectively.

Students are quickly learning how easily AI can create nefarious content, opening up a new world of bullying that neither schools nor the law is fully prepared for.

Educators are watching in horror as deepfake sexual images are created of their students, with sham voice recordings and videos also posing a looming threat. Advocates are sounding the alarm on the potential damage and on gaps in both the law and school policies.

“We need to keep up and we have a responsibility as folks who are supporting educators and supporting parents and families as well as the students themselves to help them understand the complexity of handling these situations so that they understand the context, they can learn to empathize and make ethical decisions about the use and application of these AI systems and tools,” said Pati Ruiz, senior director of education technology and emerging tech at Digital Promise.

At Westfield High School in New Jersey last year, teen boys used AI to create sexually explicit images of female classmates.

“In this situation, there was some boys or a boy — that’s to be determined — who created, without the consent of the girls, inappropriate images,” Dorota Mani, the mother of one of the girls at the school who was targeted, told CNN at the time.

In Pennsylvania, a mother allegedly created AI images of her daughter’s cheerleading rivals naked and drinking at a party before sending them to their coach, the BBC reported.

“The challenge at the time was that the district had to suspend [a student] because they really did think it was her — that she was naked and apparently smoking marijuana,” said Claudio Cerullo, founder of TeachAntiBullying.org.

Schools are struggling to respond to these new, vicious uses of AI. The Pennsylvania district could not determine on its own whether the images were fake and had to involve the police. Even experts in the field are only beginning to grasp AI’s destructive power in the schoolyard or the locker room.

In response to these issues, major tech companies are tightening their policies. Google, for instance, has introduced stricter guidelines for AI app developers on its Play Store to prevent the creation of non-consensual explicit content and other harmful material.

These guidelines mandate rigorous testing of AI tools and require developers to provide mechanisms for users to report offensive content. This move is part of a broader industry effort to mitigate the misuse of AI technologies.

Legislative action is also being taken. The Federal Trade Commission has proposed new measures to ban deepfakes, and a bipartisan group of lawmakers, led by Senate Majority Leader Chuck Schumer, has called for comprehensive regulations to address AI’s risks, including deepfakes.

Representative Alexandria Ocasio-Cortez has been vocal about the need for federal protections against non-consensual AI pornography, having experienced such violations herself.

“This is sexual violence,” Rep. Alexandria Ocasio-Cortez (D-N.Y.) said in a video last week promoting legislation to tackle deepfake pornography.

“And what is even crazier is that right now there are no federal protections for any person, regardless of your gender, if you’re a victim of nonconsensual deepfake pornography,” added Ocasio-Cortez, who said she has personally been a victim of such deepfakes.

The proposed DEFIANCE Act aims to establish a federal civil right of action for victims of non-consensual AI porn, allowing them to seek justice in court. However, addressing these issues, especially when minors are involved, is complex.

Alex Kotran, co-founder and CEO of the AI Education Project, emphasizes the need for societal norms and educational initiatives alongside legal measures.

“The way that we address it might need to go beyond enforcement because it might not be palatable for us to say, well, you know, we’re going to ruin a bunch of these kids’ lives for what might actually just be making a dumb mistake and experimenting,” Kotran said.

Concerns about privacy and over-surveillance add another layer of complexity. Efforts to protect students from AI-driven bullying must be balanced with respect for their privacy and autonomy.

The proliferation of deepfake technology poses a formidable challenge for schools, requiring a coordinated response from educators, tech companies, legislators, and communities.

As these tools become more accessible and their output more realistic, comprehensive digital literacy and robust protective measures become ever more critical.

For more news and insights, visit AI News on our website.

Dave Andre

Editor

Digital marketing enthusiast by day, nature wanderer by dusk. Dave Andre blends two decades of AI and SaaS expertise into impactful strategies for SMEs. His weekends? Lost in books on tech trends and rejuvenating on scenic trails.
