
HARRISBURG – The Senate Judiciary Committee voted today to advance legislation that would better protect young people against the serious threats posed by child sexual abuse material generated using artificial intelligence (AI).
Senate Bill 1050 – sponsored by Senators Tracy Pennycuick (R-24), Scott Martin (R-13) and Lisa Baker (R-20) – would require mandated reporters to report all instances of child sexual abuse material (CSAM) they become aware of, including those produced by a minor. This applies to teachers, child care workers, health care providers, and other individuals responsible for caring for children.
The senators noted there has been a startling increase in the amount of AI-generated CSAM being created and shared in recent years, including troubling cases in school settings.
“Staying ahead of criminals who use AI for nefarious purposes requires we remain vigilant,” said Pennycuick. “We must ensure our kids are protected from the dangers of CSAM. Mandated reporting of such incidents is an important step forward.”
“We’ve worked with numerous stakeholders and advocates for many months to better protect young people from these emerging threats online,” Martin said. “I am encouraged to see this bill moving forward so we can ensure all mandated reporters understand their responsibilities to turn over any suspected cases to the authorities for investigation.”
“As in many areas of criminal law, we will be constantly challenged to provide sufficient prosecutorial power for law enforcement to keep current with new avenues of criminal conduct,” Baker said. “This bill seeks to slam shut a door of vulnerability for young people, thus becoming an important piece in our ongoing efforts to protect them and prosecute the wrongdoers.”
The legislation was amended with additional language to ensure potential reporting gaps are closed in the future.
In the past year, the Senate has taken important steps to combat sexually explicit materials created through AI by passing Act 125 of 2024 and Act 35 of 2025, which addressed deepfakes and sexual deepfakes. Senate Bill 1050, a bipartisan proposal, would build on those measures and ensure cases involving CSAM are reported and investigated promptly.
Senate Bill 1050 was sent to the full Senate for consideration.
CONTACT:
Liz Ferry (Senator Pennycuick)
Jason Thompson (Senator Martin)
Jennifer Wilson (Senator Baker)
