Government Technology: Pennsylvania Lawmakers Consider AI Deepfake Reporting Bill

The notorious 2024 case involving the use of artificial intelligence software to create nude images of Lancaster Country Day students was cited during a Nov. 10 state Senate committee hearing on legislation to strengthen Pennsylvania’s mandatory reporting law.

Over the course of a two-hour hearing in a Montgomery County firehouse, experts in law, child welfare and technology explored the pitfalls — legal and emotional — of quickly evolving AI technologies, particularly in how they can be used to generate content depicting real and “synthetic” children in sexual poses or acts.

The Senate Republican Policy Committee was gathering input on a bill introduced by Sen. Tracey Pennycuick, R-Montgomery County, that would add AI-generated images to the types of child abuse incidents that mandatory reporters, including educators, are required to bring to the attention of child protection agencies and law enforcement. The bill is also backed by Lancaster County state Sen. Scott Martin, a key member of Senate GOP leadership.

“Artificial intelligence is here,” Pennycuick said at the hearing. “It’s not tomorrow, it’s not next week, it is already here and it’s powerful. And it has a lot of potential, but with that potential comes potential for harm as well.”

Testifying in support of expanding the state’s mandatory reporting law, Pennsylvania Family Support Alliance Chief Executive Officer and President Angela Liddle cited the Lancaster Country Day case, in which two male students were charged with multiple crimes for using AI to create nude images of 48 female students at the school, as well as 11 girls enrolled elsewhere and one adult.

Lancaster County’s District Attorney did not charge Country Day administrators for failing to report the AI-generated content when they first learned of the images in November 2023. According to the DA’s office, the possession and dissemination of AI pornography did not meet the definition of child abuse under current law.

At the time, District Attorney Heather Adams urged legislators to amend the mandatory reporting law to include the reporting of AI-generated child pornography.

“AI can quickly and effortlessly produce imagery, voices and identities that never existed in real life,” Liddle said. “This technological evolution forces us to confront a new moral reality. The line between imagination and violation has been blurred.”

The Pennsylvania Family Support Alliance is a nonprofit that trains educators on recognizing and reporting signs of child abuse and neglect.

THE NORMALIZATION WORRY

Leslie Slingsby, chief executive officer of Mission Kids, said the fabrication of images depicting children who don’t exist in real life normalizes the sexualization of children and may generate a flood of false tips to law enforcement, making it “harder to find and rescue real children who are actively being abused.”

“Our child protection framework depends on mandated reporters — our teachers, our doctors, our social workers and others acting when they see signs of abuse,” Slingsby said. “Yet many are uncertain whether AI-generated material qualifies, especially when it depicts a child that doesn’t actually exist.”

Mission Kids is a child advocacy center in Montgomery County that offers support for children who have been sexually abused — including, more recently, the victims of AI-generated images — through forensic interviews, mental health and medical referrals, and trauma-informed therapy.

Republican Sen. David Argall, who represents Schuylkill, Carbon and southern Luzerne counties, asked the panel how legislation can be crafted with enough flexibility to accommodate fast-changing technology.

Slingsby admitted she couldn’t have predicted the advances in AI technology, and none of the panel members had a clear answer for Argall.

PARENTS NEED TO PLAY A ROLE

Liddle said regulating technology goes beyond setting age restrictions, as one legislator had suggested, and may not be easy to address through laws alone. Parents, she said, need to be educated on the changing technologies, too.

“How do we go from those people who are required by law to report and get them accurate information, to really getting education to parents, when the truth is they are the best first stop in preventing their children being harmed in the virtual world?” Liddle asked.

Margaret Durkin, executive director of TechNet, a national network of technology company executives, also stressed the importance of parental involvement.

“Like preventing and removing (child sexual abuse material), we conceptually agree with the intent of proposed chatbot legislation: to create strong, sensible guardrails for children using AI companion chatbots,” Durkin wrote in her testimony for the hearing. “However, it is vital to maintain the balance between consumer protection and business innovation.”

Chief Deputy Attorney General Angela Sperrazza, who specializes in child protection law, called the legislation Pennycuick proposed “essential.”

Sperrazza heads the Child Predator Section of the Attorney General’s Office, which was created in 1995 to identify and arrest individuals for viewing or distributing sexual abuse material.

AI-generated child sexual abuse material inflicts trauma on children that is perpetuated every time the images are shared, she said.

“This bill aligns our child protection laws with our technological realities,” Sperrazza said.

From Government Technology, November 17, 2025
