Sen. Tracy Pennycuick (R-24)
In August, Lancaster County police launched an investigation into a disturbing case involving 20 female high school students. The perpetrator took these teenage girls’ real pictures, used artificial intelligence (AI) technology to generate nude “deepfake” images, and distributed them on the internet. Despite the clear harm caused, the district attorney pointed out a critical problem: a gray area in the law prevents charges from being filed in cases like these.
This incident is far from unique. We are witnessing a troubling rise in AI-generated sexual images of both minors and non-consenting adults. This technology can be used to create photos and videos that depict individuals in explicit scenarios that never occurred, with results nearly indistinguishable from real images.
Unfortunately, these deepfake images are not explicitly covered by existing state laws, including our child sexual abuse statutes.
Currently, for example, it is not illegal for a friend, colleague, or even a stranger to take photos from someone’s public social media profile, use AI to create explicit content, and then distribute the “deepfakes” online. Shockingly, some websites have even published realistic AI-generated sexual images of children.
As AI technology advances, it offers significant benefits to our daily lives, from healthcare innovations to improvements in transportation and business operations. But with this progress come serious risks and unintended consequences. The National Institute of Standards and Technology has already called for federal standards to address the potential misuse of AI. However, the U.S. Congress has yet to fully address the dangers posed by AI-generated content.
Here in Pennsylvania, as chair of the Senate Communications and Technology Committee, I introduced Senate Bill 1213 to address the alarming rise of AI-generated deepfake sexual images of children and non-consenting adults. Although current state law prohibits the distribution of intimate images without consent, it does not clearly address the use of AI deepfake technology. This loophole leaves many Pennsylvanians vulnerable to a new form of digital abuse, as seen in the recent Lancaster County case.
The bill also explicitly prohibits the use of AI to generate child sexual abuse material—previously referred to as “child pornography.” With the changes contained in SB 1213, law enforcement will now have the ability to prosecute individuals who generate and disseminate these types of child sexual abuse materials.
Last week, the Pennsylvania General Assembly passed Senate Bill 1213. For the first time in Pennsylvania’s history, legislation will be presented to the governor to combat the prevalent and highly disturbing AI-generated “deepfake” images of minors and child sexual abuse material.
This bipartisan effort has garnered widespread support, including from the Pennsylvania Office of the Attorney General and district attorneys throughout the commonwealth. We anticipate the governor will sign this critical legislation into law soon.
AI technology has incredible potential for good, but it can also be exploited. Pennsylvania needs strong laws to protect its citizens from those who use this technology to generate sexual images without consent, particularly child sexual abuse materials. With the passage of SB 1213, we are sending a clear message: the insidious use of AI to harm others will not be tolerated in our state.
And most importantly, innocent victims, like the high school girls in Lancaster County, will be able to seek justice.