The fighting for control of Pokrovsk, in eastern Ukraine, still rages on the ground, but not on pro-Russian social media, where Moscow has already claimed victory: in viral videos generated by artificial intelligence, the Ukrainian army is in retreat.
Russia has been trying to capture the logistics hub in the eastern Donetsk region for over a year, and in recent weeks the assault has intensified, with the advance on the city's outskirts – according to maps published by the Institute for the Study of War – carried out in a pincer movement. The fighting for control of the railway and road junction has not let up, but dozens of AI-generated videos circulated on social media in recent weeks – with millions of views – show scenes of surrender: Ukrainian soldiers handing over their weapons, others heading to the front in tears.
Disinformation
The fake videos are part of a “wider narrative that we have seen since the beginning of the invasion, with Zelensky sending young and old to the front against their will,” says Pablo Maristany de las Casas, an analyst at the Institute for Strategic Dialogue, quoted by AFP. “There is always an event about which false information can be constructed,” observes Carole Grimaud, a researcher at Aix-Marseille University. The videos “exploit uncertainty and fuel doubts in public opinion,” she adds.
The fake images are riddled with inconsistencies, some glaring, others harder to spot at a glance. Among the most obvious: a Ukrainian soldier who claims to be “leaving Pokrovsk” yet walks without difficulty despite a leg in a cast, a stretcher that appears to float into the air, and disembodied legs that materialize and vanish in the background.
Other fake videos, some bearing OpenAI’s Sora logo, show soldiers in Ukrainian uniform crying and begging not to be sent to the front. In some of them, the faces of Russian online streamers appear. One used the likeness of exiled Russian YouTuber Alexei Gubanov, whose face appeared in a video of a crying Ukrainian soldier. “It’s obviously not me,” he later explained in a YouTube video. “Unfortunately, many people believe it… and this plays into the hands of Russian propaganda,” he lamented.
The European Digital Media Observatory, a network of fact-checking organizations funded by the EU, says it has published more than two thousand articles related to the war in Ukraine since the Russian invasion in 2022, and artificial intelligence has become an increasingly recurrent theme among them. Disinformation is “an old tactic, but the technology is new,” notes Ian Garner, a specialist in Russian propaganda at the Pilecki Institute. The videos work by “undermining the morale of Ukrainians, saying, look, this is a person just like you, he could be your brother, your father.” At the same time, they bolster the morale of Russians.
TikTok says the accounts that appear to be behind these videos have been removed, though not before one of them had racked up more than 300,000 likes and several million views. OpenAI said it had conducted an investigation, without providing further details.
But the videos remain in circulation. AFP reports having found them on Instagram, Telegram, Facebook and X, with posts in Greek, Romanian, Bulgarian, Czech, Polish and French, as well as on the website of a Russian weekly and in a Serbian tabloid. The impact of a single fake video is difficult to measure, but “when it is repeated, it is possible that people’s perceptions change,” Grimaud said.
While some companies have shown a willingness to combat the misuse of their tools, Maristany de las Casas said, “the scale and impact of information warfare exceed companies’ responses.”