Ohio measure would ban deepfakes before an election

By Ohio.news on Dec 04, 2024

Ohio lawmakers could soon consider legislation regulating the dissemination of deepfakes to influence an election, a measure that could lead to an uptick in defamation lawsuits.

House Bill 410 comes amid an ongoing national debate about how deepfakes could influence elections at every level.

State Rep. Joe Miller, D-Amherst, who introduced the bill, said Ohio lacks a legal definition of deepfakes and how they should be regulated in elections. Miller's bill defines “deepfake media” as images, audio or video that appear to show a real person speaking or acting in a way that the person did not.

“Artificial Intelligence poses a unique and unprecedented threat to the sanctity of our elections,” Miller said in sponsor testimony.

Thomson Reuters reported this summer that “there is no comprehensive enacted federal legislation in the United States that bans or even regulates deepfakes.”

According to the National Conference of State Legislatures, most state laws related to deepfakes target “sexually explicit or pornographic video images, with some expanding existing nonconsensual intimate image laws.” However, state lawmakers have started to bar distributing “deceptive audio or visual media with the intent to injure a candidate’s reputation or to deceive a voter into voting for or against a candidate.”

The group said at least 40 states had pending legislation in the 2024 legislative session and that lawmakers had enacted at least 50 bills. In September, California Gov. Gavin Newsom signed a trio of measures that aim to “remove deceptive content from large online platforms, increase accountability, and better inform voters.”

The measure would require anyone who knowingly creates and disseminates a deepfake to influence an election to disclose that fact within the media itself, and it specifies disclosure requirements for images, audio recordings, and videos. It also bars anyone from knowingly creating and disseminating deepfakes to influence an election during the period beginning 90 days before election day and ending on election day.

The proposal allows a person harmed by a violation to sue the violator for compensatory and punitive damages and to pursue other causes of action, such as a defamation claim.

“A properly informed and seldom deceived voting base is the most important element in ensuring that the strength of our democracy does not falter,” Miller said, adding that this “is why we should follow in the footsteps of states such as Texas, Mississippi, Michigan, New Hampshire, Wisconsin, and Utah and put restrictions on the distribution and publication of election-related audio and visual material created using AI.

“With public faith in democratic institutions fading in this current political atmosphere, constituents should be protected from misinformation and technological deception,” Miller added. “A lot of our older or less technologically adept constituents already have trouble navigating the growing sprawl of news media on new technological frontiers, and the proliferation of deep fakes going unregulated will leave one of our most important voting blocs vulnerable.”

Brent Skorup, a legal fellow at the Cato Institute’s Robert A. Levy Center for Constitutional Studies, wrote in Reason that deepfake crackdowns threaten free speech.

“From vague terms ('deepfake,' 'disseminate') to harsh criminal penalties, these laws clash with First Amendment protections, especially since they fail to exempt parodies or satire,” Skorup wrote. “...In the words of a federal judge, these deepfake laws often act ‘as a hammer instead of a scalpel,’ chilling far too much speech.”

In response to a petition for discretionary review asking whether Texas’ law passes constitutional muster, the Texas Court of Criminal Appeals found that it does not.

“Given that influencing elections is the essence of political speech, it is difficult to imagine what speech would not be included under the statute including neutral statements ... and true statements,” the court ruled. “And while false statements or parody ... are not exempt from this statute, those statements are probably protected under the First Amendment.

“...This language is too broad and encompasses too many statements that have the potential to influence the democratic process,” the court added. “A citizen should not have to rely on the sense of humor, kindness, or leniency of prosecutors when discussing or sharing opinions and messages regarding governmental affairs.”