
Could Your Teen Become The Subject Of AI Generated Child Pornography?

Do you ever wonder if parents in the pioneer days worried about their teenagers? Like did they worry about them riding their horses too fast? Maybe getting caught in the barn with the preacher’s daughter? Goofing off instead of killing a deer for dinner? I mean I don’t know – I’m sure they had concerns. Probably diseases or dying of starvation as they “pioneered”.

But it just seems so unfair that there’s just no end to things you have to worry about today. And I honestly feel guilty sometimes even talking about it because it’s like Good God, one more thing to add to the already long list. I suppose you have to know, right? Maybe I’m saying, don’t shoot the messenger, okay?

This is Speaking of Teens, the podcast that teaches you the science of parenting adolescents so you can be less stressed and more excited about having a teenager. I’m Ann Coleman, I’m an attorney turned parent educator and I’ve spent years studying the science of teen behavior and I want to help you learn how to parent your teens for the best possible outcome.

Today I want to talk briefly about AI generated porn.

I’ve touched on a couple of these issues in recent newsletters (so, if you don’t get the newsletter or don’t open it, you’re missing out – just go to speakingofteens.com and scroll down the page and sign up – and if you gave a junk email address previously, I can assure you, you want this weekly newsletter so go, sign up again!)

These are issues we’re seeing in the US, and I’m quite certain that if you’re in Canada, Australia, the UK, New Zealand, Germany – wherever you live in the world, the same thing is happening there. Being aware and prepared to discuss these things as you find a way in could make a huge difference.

Let’s start with AI porn. When ChatGPT made its big debut, schools were scrambling around trying to figure out how to keep kids from cheating. But as it turns out, that’s not been much of an issue at all.

However, as kids will do, they figured out even riskier uses for AI. As these new AI-powered image creators hit the market (like Midjourney or Da Vinci), other app creators were coming up with ways that you could put in a photo of someone and manipulate it to change them into a cartoon character or change their clothes or take off their clothes.

One New York Times journalist calls these especially offensive programs “nudification” apps. They literally take a regular picture of someone and replace the clothing with what one would imagine is underneath.

But there are other programs that will just take the face of someone you upload and put it onto a porn star in a photo or video – some people call this “deep fake” or “deep nudes”.

These apps have gotten more than one teenage boy in hot water in this country. This past October, some boys at Westfield High School in New Jersey either took photos of girls at their school or copied them from somewhere else, and generated these AI nudes or deepfake pornography, if you will. But they didn’t stop there – they shared the photos among themselves in group chats.

Now, at the time this was reported back in November by MSNBC, the article stated there was no federal law that addresses AI-generated nudes.

However, the FBI issued a Public Service Announcement just this past month entitled “Child Sexual Abuse Material Created by Generative AI and Similar Online Tools is Illegal”. The FBI says that these AI-generated photos are indeed “child sexual abuse material” (otherwise known as CSAM – but what most of us still know as child pornography).

The FBI went on to point out that Federal law does prohibit the “production, advertisement, transportation, distribution, receipt, sale, access with intent to view, and possession” of these very realistic but computer-generated images.

This FBI announcement goes on to point out that generative AI tools using simple text prompts have made it extremely simple for almost anyone (including teens) to make CSAM, and they are taking it seriously.

For example, a child psychiatrist in Charlotte, North Carolina was using a web-based AI app to take images of kids who were fully dressed and turn them into kiddy porn. He’s been sentenced to 30 years in prison, followed by 30 years of supervised release, thank God. And yes, he recorded and photographed his own patients, including at least one taking a shower during an outpatient visit, somehow. Just unthinkable.

Another fellow from Pennsylvania, who was already a registered sex offender, was convicted of having CSAM of child celebrities. And again, this is AI generated – the faces of child actors were placed on naked bodies or on bodies engaged in sexual acts.

The scary thing is that the machine learning models behind these AI applications generate new images from studying actual images – data sets of images, like explicit photos. So, the apps can take that information and make brand new images by fusing together bits from that data set. And it doesn’t take a warehouse-sized computer to do this either – people can set up something at home with computers from Best Buy. And this stuff has gotten so good that it can generate any type of sex act based on a simple text prompt, virtually indistinguishable from the real thing.

And what’s disgusting is there are underground, online, dark web communities of sickos who are training AI with real CSAM, which will soon (if not already) be producing very real-looking sexually explicit images of children and teenagers.

Now, the problem is, a lot of school administrators apparently don’t know what to do with this issue. The mother of one of the girls who was victimized at Westfield High says very little has been done by the district to address what happened to the girls and that school policies have not even been updated to try and prevent the same things from happening in the future. Another similar story quoted school officials as saying it was a police matter.

Just yesterday there was a story in the Los Angeles Daily News about how unprepared schools and the legal system are for all the AI nudes floating around. The story cites reports of boys generating AI nudes of classmates at both middle and high schools from Aliso Viejo to Laguna Beach to Beverly Hills. And everyone from police to school administrators is admitting they’re “way behind the curve” – they just don’t have policies or laws or regulations they can hang their hat on locally.

The story quotes a Newport Beach psychotherapist who says she just doesn’t think schools or police actually understand how psychologically harmful this type of deepfake is. She says she’s reached out to police to make reports for her patients, and they haven’t yet taken it seriously or even understood what it is.

Then there’s the law; an associate dean from Loyola Law School says that most states’ child pornography laws do not mention computer generated deep fake nudes. These laws were not written when anything like this was even imagined as a possibility in the future – no one saw this coming until it was too late. So, child pornography laws are about protecting a child from actual abuse – from photos being made in the process of abusing that particular child.

And this is also where it gets hairy at the federal level. Back in 2002, in Ashcroft versus the Free Speech Coalition, the Supreme Court struck down a provision of the Child Pornography Prevention Act as being overbroad and in violation of the First Amendment’s guarantee of freedom of speech.

At that time the federal law defined child pornography as basically any photo, image or video of any kind, including a “computer or computer-generated image or picture” that “is, or appears to be, of a minor engaging in sexually explicit conduct.”

So that could easily be interpreted to include something like AI-generated or fake sexual images of children.

But the Supreme Court said nope – it’s not child pornography unless a real child was harmed in the making of the images. The whole purpose of the law is to protect children and if the images are fake, no children were harmed. So, at that point the Court basically said, you can produce all the virtual CSAM you want as long as no actual children are hurt – Freedom of Speech.

After that opinion, Congress enacted the PROTECT Act and revised the definition of child pornography to add “a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct.” Did you catch that? They changed “appears to be” to “indistinguishable from”. Can you hear me rolling my eyes?

That’s why practicing law is maddening.

This definition has not been challenged yet, so right now this appears to be what the FBI is banking on: that these AI images, created by a computer, are literally indistinguishable from images of real kids.

Now, as far as the states go, there are fewer than a dozen that have any laws on the books banning nonconsensual deepfake porn of any kind. In some of those states the laws are criminal, but in others they simply allow the violator to be sued civilly.

So at this time, in most states, there are simply no state laws local officials can work with to do anything about minors acting against other minors – even the bullying or cybercrimes laws usually require some level of threat to personal safety.

But school districts should already have policies and codes of conduct in place that incorporate this type of behavior – it would probably be considered harassment.

So, talk to your son, talk to your daughter, make sure they know that any behavior like this is not only cruel, it’s harassment, and they might find themselves suspended or expelled, and extremely embarrassed. Also, let them know that if they see anything like this happening, they shouldn’t engage with the person doing it, shouldn’t look at it, share it, or comment on it, and should let you know right away.

And if it happens to your child, if someone makes a deepfake nude image of them, skip right over the principal’s office and go straight to the school board. Demand that the person or persons who made or shared the images turn over their phones to be forensically examined, to determine with whom or on what platforms those images were shared. (Okay, that’s my extra 2 cents, but if it were my child, I’d at least try that – get an attorney to make the demand for you.) And then you can also contact the National Center for Missing and Exploited Children for help in removing the images from the internet. I’ll have that link and all the other resources in the show notes.

Okay, that’s it for Speaking of Teens today. I’m so glad you were here and hope if you got something out of this episode you’ll go listen to a few others and that you’ll share the podcast with other parents.

And if you enjoy the show, you’ll really love Parent Camp, a community of parents learning together through the Field Guide for Teens, meeting with me weekly, learning from other expert guests and more – check it out through the link at the bottom of the episode description where you’re listening.

Alright, until next time, remember, a little change goes a long way.