Pro-Russian disinformation campaign is using free AI tools to fuel a “content explosion”

According to new research released last week, a pro-Russian disinformation campaign is using consumer artificial intelligence tools to fuel a “content explosion” focused on exacerbating existing tensions around global elections, Ukraine, immigration, and other controversial issues.
The campaign, known by many names, including Operation Overload and Matryoshka (other researchers have also associated it with Storm-1679), has been operating since 2023 and has been linked to the Russian government by multiple groups, including Microsoft and the Institute for Strategic Dialogue. The campaign spreads false narratives by impersonating media outlets, with the apparent aim of sowing division in democratic countries. While the campaign targets audiences around the world, including in the United States, its main target has been Ukraine. Hundreds of the campaign’s AI-manipulated videos have attempted to push pro-Russian narratives.
The report details a significant increase in the content produced by those running the campaign between September 2024 and May 2025, content that has received millions of views worldwide.
In their report, the researchers identified 230 unique pieces of content promoted by the campaign between July 2023 and June 2024, including images, videos, QR codes, and fake websites. Over the past eight months, however, the researchers say Operation Overload churned out 587 unique pieces of content, most of them created with the help of AI tools.
The spike in content is driven by consumer-grade AI tools that are available online for free, the researchers say. This easy access helps fuel the campaign’s tactic of “content amalgamation,” in which those running the operation use AI tools to produce multiple pieces of content pushing the same story.
“This marks a shift toward more scalable, multilingual, and increasingly sophisticated propaganda tactics,” researchers from the London-based nonprofit Reset Tech wrote in the report. “The campaign has substantially amped up the production of new content in the past eight months, signaling a shift toward faster, more scalable content creation methods.”
The researchers were also struck by the variety of tools and content types the campaign pursued. “What surprised me was the diversity of the content, the different types of content that they started using,” Aleksandra Atanasova, lead open-source intelligence researcher at Reset Tech, told WIRED. “It’s like they have diversified their palette to catch as many different angles of those stories. They’re layering up different types of content.”
Atanasova added that the campaign did not appear to be using any custom AI tools to achieve its goals, but instead relied on AI-powered voice and image generators that are accessible to everyone.
While it was difficult to identify all the tools the campaign’s operators used, the researchers were able to narrow down to one tool in particular: Flux AI.
Flux AI is a text-to-image generator developed by Black Forest Labs, a German company founded by former employees of Stability AI. Using the SightEngine image analysis tool, the researchers found a 99 percent likelihood that a number of the fake images shared by Operation Overload (some of which claimed to show Muslim migrants setting fires in Berlin and Paris) had been created using Flux AI.