Meta’s Oversight Board Blasts Facebook Policy Allowing Altered Clip of Biden to Stay Online. How Could the Spread of Disinformation Impact the Upcoming Presidential Election?

Meta’s Oversight Board said it recently decided to uphold the social media company’s decision to leave up a Facebook video, first posted last May, that was altered to make it seem as though President Joe Biden is touching his adult granddaughter inappropriately.

The board criticized Meta’s manipulated media policy in the decision, stating it was “incoherent” and “inappropriately focused on how content has been created, rather than on which specific harms it aims to prevent,” using electoral processes as an example.

With such content floating around the web, some experts — and the Oversight Board itself — are concerned about how the spread of misinformation could affect voters with election season well underway.

Commenters roast Biden for the number of candles on his 81st birthday cake. (Photo: @joebiden/Instagram)

“Meta should reconsider this policy quickly, given the number of elections in 2024,” the Oversight Board said in its decision on the Biden video.

The seven-second clip is based on actual footage of the president in October 2022 as he voted in person during the U.S. midterm elections.

The original, unaltered clip showed Biden exchanging “I Voted” stickers with his adult granddaughter, who was voting for the first time. The clip shows him “placing the sticker above her chest, according to her instruction, and then kissing her on the cheek,” according to the Oversight Board.

The altered version loops the moment Biden’s hand makes contact with his granddaughter’s chest, making it look as though he is touching her inappropriately.

The edited clip also featured a lyric from the song “Simon Says” by Pharoahe Monch with a caption referring to the U.S. president as a “sick pedophile” and describing voters who supported him as “mentally unwell,” the Oversight Board said. 

“Other posts containing the same altered video clip, but not the same soundtrack or caption, went viral in January 2023,” according to the board.

Why Did Meta’s Oversight Board Let the Altered Video of Biden Stay Online?

A user reported the post to Meta as hate speech, but the report was automatically closed without being reviewed. After the user appealed the decision, a human reviewer determined it wasn’t a violation, so the post remained online and the user then appealed to the Oversight Board.

The board’s decision on Feb. 5 said the Facebook post doesn’t violate Meta’s manipulated media policy, which only applies “to video created through artificial intelligence and only to content showing people saying things they did not say.”

The Biden clip wasn’t altered using AI, and it depicted something he did not do rather than something he did not say, the board explained, meaning it didn’t violate the existing policy. The board also said the clip’s edit is obvious and therefore unlikely to mislead the “average user” as to whether it’s authentic, “which, according to Meta, is a key characteristic of manipulated media.”

Experts have said that while election-related manipulated videos and images have existed before, this year’s U.S. presidential election marks the first during which AI tools that can quickly create fake yet convincing content are so readily available, the Associated Press reported.

AI-generated audio recordings were released in Slovakia days before the country’s elections last year, according to the AP. In the fake clip, which spread across social media even as fact-checkers rushed to show it wasn’t real, a liberal candidate can be heard talking about rigging the election and raising the price of beer, the AP reported.

Biden said at a news conference last October that he’d seen other examples of “deep fake” clips featuring himself. “I said, ‘when the hell did I say that?’” the president said, according to NBC News.

Last October, Biden’s administration announced an executive order establishing best practices and standards for detecting AI-generated content. Under the order, the Department of Commerce will develop guidance for content authentication and watermarking to clearly label AI-generated content, the White House said.

In the Oversight Board’s recent decision on the questionable Biden clip — which wasn’t edited with AI — the board said the experts it consulted and public comments agreed that content altered without AI is prevalent and not necessarily any less misleading than clips edited with AI.

“For example, most phones have features to edit content. Therefore, the policy should not treat ‘deep fakes’ differently to content altered in other ways (for example, ‘cheap fakes’),” the board said. 

How the Spread of Disinformation Could Impact the Upcoming Presidential Election

Irina Raicu, the internet ethics program director for Santa Clara University’s Markkula Center for Applied Ethics in California, told Atlanta Black Star there’s a concept that undercuts Meta’s claim that people won’t believe the altered Biden clip is real.

“Meta seems to say that people won’t actually believe this, people will know that this is a joke or an exaggeration, but there’s a concept called availability cascade – which basically says if you keep hearing the same thing over and over again, or in this case, see the same thing over and over again, you start to feel like there’s something there just because you’ve come across it so many times,” Raicu said.

Raicu said she wondered whether the social media company had conducted research showing that the altered clip is, in fact, not believable.

“I don’t think it’s true that people really don’t believe it, especially if they come across something like an accusation like this over and over again,” she said.

The concern about election-related impacts goes beyond whether videos like the altered Biden clip affect how people vote in November to whether they vote at all, according to Raicu.

“If they are really confused or if presented with a lot of negative information about the people they intended to vote for, they might just stay home, so that’s a big initial problem,” Raicu said.

“We know that it’s often very hard to debunk things, and often, the misinformation is spread much more widely than the debunking and efforts to correct it,” she said.

Part of the problem with combating the rampant misinformation spread, according to Raicu, is that companies haven’t invested enough in efforts to fight it.

“I think that the teams that are entrusted with developing the policies, and then especially with implementing them, are under-resourced,” she said. 

The fact that Meta’s Oversight Board calls out the manipulated media policy suggests that there’s room for improvement in such policies, Raicu explained.

“That there can be a lot clearer policies, and policies that are not so narrowly drawn that they leave out, you know, 80 percent of the misinformation that we’re hoping to have covered,” she said.
