
Deepfake Videos during Russian Invasion of Ukraine Could Undermine Trust

Thematic analysis of 2022 tweets highlights impacts of wartime deepfakes on emotions, trust, and conspiracy theories


A new study explores themes in Twitter discussions of deepfake videos related to the Russian invasion of Ukraine, highlighting the potential for real videos to be mistaken for deepfakes and for deepfakes to fuel conspiracy theories. John Twomey of University College Cork, Ireland, and colleagues present these findings in the open-access journal PLOS ONE.

Created using artificial intelligence, deepfake videos typically feature a person saying and doing things they never actually did in real life. Deepfake technology has advanced considerably, sparking concerns about its potential harms. Deepfakes related to the Russian invasion of Ukraine represent the first instances in which deepfakes have been used in attempts to influence a war.


To better understand the potential harms of deepfakes, Twomey and colleagues analyzed Twitter discussions about deepfakes related to the invasion. They used a qualitative approach known as thematic analysis to identify and understand patterns in the discussions, which included a total of 1,231 tweets from 2022.

The researchers found that many of the tweets expressed negative reactions to news about deepfakes. For instance, some tweets expressed worry, shock, or confusion about news related to a deepfake that falsely depicted Ukrainian President Volodymyr Zelensky surrendering to Russia. However, some tweets overlooked potential harms or had positive reactions to deepfakes directed against political rivals, especially deepfakes created as satire or entertainment.

Some tweets warned about the need to prepare for increased use of deepfakes, discussed how to detect them, or highlighted the role of the media and government in rebutting them. However, some tweets suggested that deepfakes had undermined users’ trust to the point that they no longer trusted any footage of the invasion. Some tweets linked deepfakes to users’ apparent belief in conspiracy theories, such as deepfakes of world leaders being used as cover while they were actually in hiding, or that the entire invasion was fake, anti-Russian propaganda.

This analysis suggests that efforts to educate the public about deepfakes may unintentionally undermine trust in real videos. The authors note that their findings and future research could help inform efforts to mitigate the harms of deepfakes.

The authors add: “Much previous research on deepfakes has been concerned with potential future harms of the technology. However we have focused on how deepfakes are already impacting social media as we have seen during Russia's invasion of Ukraine. Our research shows how deepfakes are undermining faith in real media and are being used to evidence deepfake conspiracy theories.”


This press release was provided by PLOS.
