Ending the deepfake threat to elections before it starts

A woman in Washington, D.C., views a manipulated video on Jan. 24, 2019, that changes what is said by former President Donald Trump and former President Barack Obama, illustrating how deepfake technology can deceive viewers. Deepfakes could become a regular part of election advertising unless legislation passes to regulate fake video and audio. ROB LEVER/AFP via Getty Images

Officials try to get ahead of misinformation created by AI before the 2024 elections.

A reporter’s voice intones that Joe Biden has won the 2024 presidential election. The video hard-cuts to military operations, sirens and a Chinese invasion of Taiwan before depicting boarded-up storefronts, military patrols in the streets and waves of immigrants flooding the U.S.’s southern border. The connection between a second Biden term and global chaos couldn’t be more clearly delineated.

This April ad from the Republican National Committee is among the most prominent recent examples of artificial intelligence being used to influence the political process. And with generative AI technology far outpacing the legislation to regulate it, there are growing concerns about the impact it could have on voters being misled about candidates – and the electoral process overall. 

“We’re talking about deepfakes that are designed to fool people,” Robert Weissman, president of the nonprofit consumer advocacy organization Public Citizen, told City & State. “Permitting deepfakes will undermine political discourse.”

Weissman isn’t the only player focused on the dangers AI poses to the electoral process: Even as the Federal Election Commission and lawmakers debate a path forward, state and local officials are trying to combat misinformation targeting both candidates and election operations. 

Generative AI creates its center stage

Misinformation efforts targeting political campaigns and election operations aren’t new. But with the evolution and proliferation of generative AI – artificial intelligence that uses generative models to produce text, images or other media – parsing through the waves of misinformation and trying to debunk it all seems like a Herculean task for both voters and election officials. 

Sam Chen, a Republican political strategist, told City & State that advancements in deepfake technology – in which bad actors use AI and existing recordings to generate realistic but false audio and video – mean that voters will continue to have a tough time discerning fact from fiction as they scroll past contrasting content and stories. 

“We’ve always used every tool in our arsenal” to create a narrative around a candidate, Chen, who has also given a series of lectures on political and media narratives at colleges and universities around the country, said. “You can kind of see this in finding the worst photo you can find of a person. We can make it look grainy or black-and-white and we can do Photoshop. AI is the newest entry into that.”

AI and deepfake technology have already had an impact, muddying the media waters and allowing unverified information channels to spread AI-generated photos and videos with no confirmation of whether the content is real or fake. 

There are both nonprofit and public organizations attempting to combat misinformation, particularly around elections. One such entity is the News Literacy Project, a nonpartisan education nonprofit seeking to build a national movement to advance the practice of news literacy. 

Peter Adams, senior vice president of research and design at the News Literacy Project, recognized the speed and scale at which false information can be spread, adding that the loudest voices in the room are often the ones that get heard. 

“Some of the social and digital tools that we use are really optimized for engagement, so the most outrageous opinionated stuff is vying for our attention along with misinformation and disinformation when we’re on social media,” Adams told City & State. “We need to be really deliberate and not just sort of hand over our media diet to the algorithms … there are a lot of murky examples and there’s more hyper-partisan stuff masquerading as news than ever before.”

Because many social media platforms surface posts and recommend content based on users’ interests, the rabbit holes that platforms like X, formerly known as Twitter, can send people down tend to confirm existing biases – giving advocates and researchers even more cause for concern. 

“(We) might see the use of large language models to spread false claims about the election process and potentially game recommendation algorithms on social media platforms to kind of create fake news websites – including local news websites or spoof election office websites,” Mekela Panditharatne, who serves as counsel for the Brennan Center’s Democracy Program, told City & State, adding that there are “changes in (AI’s) speed, scale and sophistication that, when taken in the aggregate, could produce significant changes in the landscape.”

Falsehoods that are repeated – such as those related to former President Donald Trump’s election fraud claims in 2020 – tend to stick regardless of their accuracy. This phenomenon, Adams said, is known as the illusory truth effect. 

“When you see a photo of a political candidate in a compromising position, or you see a photo of Trump in an orange jumpsuit, it sticks in a way that I think text doesn’t,” Adams said. “If you see a false claim or a false image repeated over and over again, some part of it will stick … The rise of synthetic visuals and synthetic media is deeply concerning and really upends our notion of what counts as evidence.”

Debunking repeated falsehoods about a particular candidate is the responsibility of the campaign. But as incumbents and challengers take jabs at each other and craft their own narratives, election officials are becoming increasingly concerned about the falsehoods being spread about election operations – everything from the date of Election Day and the locations of polling places to the legitimacy of mail-in ballots and drop boxes.

Jake Dilemani, a Democratic consultant with Mercury’s New York office, told City & State that campaigns will always be “behind the eight ball” when it comes to fact-checking social media in real time. 

“It’s no good if an ad that is deliberately deceptive goes out and no one knows that’s the case until two weeks later,” Dilemani said. “Two weeks in the campaign cycle is a lifetime.” 

Secretary of the Commonwealth Al Schmidt, who endured the fallout from election falsehoods while serving as Philadelphia’s Republican city commissioner in 2020, stressed that sharing information must be a primary function of elected officials, not a support function. 

“Most people aren’t necessarily following all this closely. Most, at least in Pennsylvania, are voting in person on new voting systems, or they’re voting by mail, which is only a couple of years old in Pennsylvania. With all these changes, it’s no wonder that questions come along,” Schmidt told City & State. “But it shouldn’t be a surprise that bad-faith actors are seeking to exploit people having those questions to mislead them and undermine confidence in results when they lose.”

Policing AI

While policymakers may not be looking to ban AI or deepfake technology altogether, they are striving to rein it in at a time when the public is susceptible to an increasing amount of misinformation. 

“With our biases, plus our short memories as voters, things are just going to get worse. This is an open season for people who use AI – they don’t even need deepfakes. You can use Photoshop and fake news stories, it’s just that the more AI you use, the more convincing it becomes,” Chen said. “The great challenge is going to be: To what degree do we have the legal authority to regulate it?”

Regulatory talks at the federal level are already underway, but nothing is certain as the Federal Election Commission weighs both its authority and a realistic path to policing deepfake technology. 

The progressive consumer rights advocacy group Public Citizen called on the two major political parties and their presidential candidates to pledge not to use generative AI or deepfake technology to mislead or defraud voters during the 2024 election cycle. The group also petitioned the FEC earlier this year to issue a rule to clarify the law against “fraudulent misrepresentation” and how it applies to AI. 

Weissman said the actors spreading misinformation online seek not only to confuse voters about particular topics and candidates but also to ingrain an overall sense of distrust in the election process. 

“The prospect of widespread deepfakes threatens to – very consequentially – undermine political discourse and speech in two ways: by tricking people into thinking things happened that haven’t happened, but also by making it possible for candidates or other political figures to deny things that actually did happen,” Weissman said. “The impact of those two factors combined is to sow political mistrust, diminish actual political debate and leave people kind of helpless against competing claims – where all you can do is revert to your political tribe.”

The FEC’s unanimous procedural vote in August advanced Public Citizen’s petition, with a 60-day public comment window opening later that month. Public Citizen proposed giving candidates the option to prominently disclose the use of AI rather than having them avoid using the technology in campaign ads altogether. 

Weissman said he expects the FEC to make a decision by the end of October on whether to proceed with a rulemaking process. From there, the FEC would propose a rule and vote on it in the near future. 

Chen, who supports a ban on the use of deepfake technology in campaign ads, said the FEC’s challenge is to find a balance between regulating the technology wisely while not infringing upon freedom of speech. 

“The FEC gives campaigns a lot of leeway … but they’re not allowed to outright lie about something. You can make the argument that something like a deepfake would be an outright lie,” he said. “It’d be tricky legally to ban it outright. But I certainly think (an interpretation) along those lines would be within the sphere of (the FEC’s) current regulations.” 

The FEC’s authority is limited, however. Even if the commission were to complete the rulemaking and make a firm decision on the use of AI in campaign ads, it would do nothing to stop outside groups such as political action committees from imitating a candidate, nor would it require them to disclose the use of AI in their ads. 

For blatant misinformation that falls outside the FEC’s purview, state and local officials are attempting both to connect individuals with reliable resources and to debunk falsehoods already spreading about elections and voting methods. 

The Answer to AI

Outside of the legal sphere, local and state officials are utilizing existing tools to combat bad actors and their growing digital toolbox. Over the summer, Gov. Josh Shapiro signed an executive order creating an AI governing board. The state’s first generative AI working group will help state agencies find ways to use AI to improve government services while also establishing guardrails for use within the public sector. 

Panditharatne said that as bad actors begin to improve in their usage of AI, so should government entities. “Developing smart and scalable moderation policies for AI-generated content and this new landscape will be critical,” she said. 

Michael Sage, the chief information officer for the County Commissioners Association of Pennsylvania, agreed.

“If (the AI board) can produce materials that are reusable guidance for the entire commonwealth, that’s going to be invaluable because everyone’s facing the struggle,” Sage told City & State. “How do we use this? And how don’t we use it?”

Schmidt and Panditharatne also touted the concept of “pre-bunking” – identifying the strategies and trends that misinformation machines follow and getting accurate information out through those channels ahead of time. 

“It’s helpful because, to some extent, election officials should know some subset of what false narratives are likely to gain traction in the next election and forthcoming elections. They can put out materials that clarify important details about election security and about the election process,” Panditharatne said. 

Schmidt shared similar thoughts, noting that secretaries of state have clear lines of communication with each other and federal partners. Looking toward the commonwealth’s elections in 2023 and beyond, the onus falls on election officials at every level to use their established networks to inform the public of what new information – true or false – is popping up online. 

“It’s a matter of sharing factual information, doing so repeatedly and getting other voices to amplify it as best you can,” Schmidt said. “You can’t necessarily shut down people from saying all sorts of things on social media. But it’s important for us to tell the truth. I think the truth is the only antidote to the lies out there.”
