

What Are Deep Fakes?
In short, Deep Fakes are images or videos produced using artificial intelligence (AI). Sadly, criminals are exploiting this technology: they take an innocent image from someone's social media (a child's or an adult's) and use it to produce illicit content that makes it "appear" the person in the picture or video is performing sexual acts. The victim may have no idea their image is being used until someone recognizes them and questions them about it. It is the use of another person's image to produce illicit content for profit, without their consent.
It is a crime to use someone's picture or image in sexual imagery without their express consent! If this is happening to you or someone you know, contact local law enforcement.
347 fake nude images of 60 high school girls.
Created by AI. Shared without consent.
The girls' crime? Existing online. Having Instagram. Going to school.
This isn't a "scandal." It's sexual abuse with new tools—and most schools have no policy for it.
This happened in Pennsylvania. Two boys used free AI tools to generate fake nudes of their female classmates and circulated them on Discord.
The girls did nothing. They didn't "send nudes." They posted regular photos online—selfies, group shots, school pictures. That was enough data for AI to strip and sexualize them.
And this is not isolated.
Girls across the U.S., UK, Australia, and beyond are facing the same thing.
The technology is free, easy to use, and spreading fast. The laws? Patchwork at best. Most countries are scrambling to catch up.
Schools often respond by blaming girls for being online in the first place. Which is victim-blaming dressed up as "digital safety."
If you're a mother, auntie, teacher, or mentor:
1. Talk about deepfakes explicitly—no shame, all facts
Don't wait for "the right time." Have the conversation now:
"Images can be faked using AI. If someone creates a fake nude of you, it's not your fault. It's abuse. And we will support you, not blame you."
Make it clear: the problem isn't her online presence. It's the boys weaponizing technology.
2. Push your school for clear deepfake policies
Most schools have no protocol. Ask:
How does the school classify deepfakes? (It should be sexual harassment/abuse, not "cyberbullying")
What are the consequences for perpetrators?
How will victims be protected and supported?
Is there a reporting pathway that doesn't re-traumatize girls?
If the school doesn't have answers, that's your signal to organize other parents and demand them.
3. Know the law in your area
Find out what protections exist where you live. Contact local digital rights organizations or women's legal aid groups for guidance.
4. If it happens: document, report, support
If a girl you know is targeted:
Screenshot everything (URLs, usernames, timestamps)
Report to the platform immediately (most have deepfake/non-consensual image policies)
Report to the school's SRO (School Resource Officer) AND to local law enforcement (even if local laws are weak, reporting creates a paper trail)
Contact organizations that support victims (digital rights groups, women's safety orgs)
Most importantly: believe her, protect her privacy, and don't let shame silence her.
In Florida, Florida Statute 784.049 addresses the crime of sexually explicit deepfake images. Read the statute HERE:

Read the FBI's warning about Deep Fakes: Click HERE