Pennsylvania bill seeks to close ‘loophole’ for AI-generated child sexual abuse materials

This article was originally published by Pennsylvania Capital-Star.
A two-hour state Senate committee hearing explored the use of artificial intelligence to create images and videos depicting children — sometimes real, sometimes “synthetic” — in sexually suggestive poses or acts.
Possession of these kinds of images is already illegal in Pennsylvania, but the technology continues to develop faster than legislation can keep up. Protecting children “from the harmful aspects of AI” is an ongoing goal for Sen. Tracy Pennycuick (R-Montgomery), whose newly authored bill was the focus of Monday’s Senate Majority Policy meeting.
“We want to make sure our children are protected from bad actors,” said Pennycuick. “Children are now targeted in ways we never thought possible.”
Mandated reporters in the state — including health care practitioners, teachers, clergy and others — already must report suspected child abuse, which includes explicit materials, to law enforcement. Senate Bill 1050 would expand that requirement to include content made by AI, building on previous laws that targeted deepfakes.
“Every day, I see the lifelong impact of what happens when a child’s body or image is misused,” said Leslie Slingsby, the CEO of services and operations at Mission Kids Advocacy Center in Montgomery County. “Today, technology is creating new ways for that harm to occur, even when a child was never physically present.”
Mission Kids provides specialized forensic interviews and trauma-informed therapy for children, and has more recently pivoted to address AI-generated materials. The omission of so-called synthetic children (images not based on any one real child) from the state's code is a "dangerous" loophole, Slingsby noted, one that could allow consumers of these materials to work or volunteer with children.
“Bringing AI-generated (child sex abuse materials) under mandated reporting law would, one, align in the spirit of child protection by recognizing that exploitation is exploitation, no matter how it’s created,” said Slingsby. “Second, gives investigators the authority to intervene early … and third, ensures protection for real children whose images are used, even if altered or fabricated.”
In December, two Lancaster County juveniles were charged with creating dozens of AI-generated pornographic images of other minor students, and a Bucks County student was charged with a similar crime in July.
In both incidents, Pennycuick said, mandated reporters "failed these students"; one school allegedly contacted the perpetrator's father, but not law enforcement.
Tool For Law Enforcement
Angela Sperrazza, a chief deputy attorney general with the state's Child Predator Section, said that just one referral for AI-generated child sexual abuse materials often reveals additional content.
“Creating an explicit duty for mandated reporters to notify authorities when they believe a child has been victimized through child sexual abuse materials — traditional or AI-generated images — is a critical step toward early investigation and intervention,” said Sperrazza.
“A single referral can lead to the identification of devices, online networks, service providers and offenders before further harm can occur,” Sperrazza added.
The presence of this AI-generated media "has posed significant challenges" to law enforcement and prosecutors, said Gabriella Glenning, an assistant district attorney in Montgomery County.
“Artificial intelligence has complicated the identification of (child sexual abuse material) victims, requiring substantial investigative resources to determine whether these images are authentic or digitally fabricated,” said Glenning. “It is now alarmingly simple to superimpose a person’s face onto altered images.”
Glenning added that suspicion of sexual exploitation and sexual extortion could be added to Pennycuick's underlying bill, which could help uncover blackmail attempts. Once a report is made to the state's ChildLine, an investigator can determine whether images or videos meet the criteria and how they were created.
Pennycuick’s bill advanced out of the Judiciary Committee unanimously in late October and is awaiting consideration in the full Senate.