Deepfakes may soon land creators in legal trouble
By CHRISTEN SMITH
The Center Square
HARRISBURG – Deepfake content creators may soon face legal penalties if caught using the fabricated media for nefarious purposes.
The Senate Judiciary Committee unanimously approved a bill that would define the crime as “digital forgery” and, depending on its seriousness, punish it as a third-degree misdemeanor.
Under the new provision, anyone who uses deepfakes to defraud or harm others can be charged. Deepfakes are artificially created images, videos and audio recordings that mimic a real person. Law enforcement officers who use the content in the course of their work and those who unknowingly share digital forgeries would be exempt.
“What we are seeing amounts to forgery in the digital world, an imitation with the intent to defraud, so close to reality that many cannot discern real from fake,” said Sen. Tracy Pennycuick, R-Red Hill. “The potential for widespread harm will only grow as artificial intelligence tools become more sophisticated and more readily accessible.”
Pennycuick said such forgeries have been used to steal money from seniors by impersonating friends and relatives and to spread misinformation through fraudulent political content, among other scams.
“This is a common-sense, balanced approach that will help protect our constituents, businesses, and institutions,” she said. “By updating our laws, we ensure law enforcement will have the tools they need to hold perpetrators accountable, while preserving the constitutional freedoms we cherish.”
It’s just the latest in a string of bills meant to limit the proliferation of deepfake content and its societal harms. Last year, a bipartisan proposal cosponsored by Pennycuick and Democratic Sens. John Kane and Jimmy Dillon targeted political misuses specifically.
The issue has drawn attention across the country as federal regulators weigh new limits for generative artificial intelligence technology.
For now, 14 states have enacted resolutions and pursued new laws targeting deepfakes. One such state, New Hampshire, made headlines before its Jan. 23, 2024, primary election after a robocall featuring an artificially generated imitation of President Joe Biden’s voice told residents to stay home and save their votes for the November general election.
Rep. Tarik Khan, D-Philadelphia, has sponsored similar legislation in the House and said the escalation of negative ads from actual photos, videos and sound clips to deepfakes serves no purpose beyond “deception.”