Generative AI has a plethora of well-documented misuses, from making up academic papers to copying artists. And now, it appears to be cropping up in state influence operations.
One recent campaign was “very likely” helped by commercial AI voice generation products, including tech publicly released by the hot startup ElevenLabs, according to a recent report from Massachusetts-based threat intelligence company Recorded Future.
The report describes a Russia-linked campaign, dubbed “Operation Undercut,” that was designed to undermine Europe’s support for Ukraine and prominently used AI-generated voiceovers on fake or misleading “news” videos.
The videos, which targeted European audiences, attacked Ukrainian politicians as corrupt or questioned the usefulness of military aid to Ukraine, among other themes. For example, one video touted that “even jammers can’t save American Abrams tanks,” referring to devices US tanks use to disrupt incoming missiles – reinforcing the campaign’s message that sending high-tech armor to Ukraine is pointless.
The report states that the video creators “very likely” used AI voice generation, including ElevenLabs tech, to make their content appear more legitimate. To verify this, Recorded Future’s researchers submitted the clips to ElevenLabs’ own AI Speech Classifier, which lets anyone “detect whether an audio clip was created using ElevenLabs,” and got a match.
ElevenLabs did not respond to requests for comment. Although Recorded Future noted the likely use of several commercial AI voice generation tools, it did not name any others besides ElevenLabs.
The usefulness of AI voice generation was inadvertently showcased by the influence campaign’s own orchestrators, who – rather sloppily – released some videos with real human voiceovers that had “a discernible Russian accent.” In contrast, the AI-generated voiceovers spoke in multiple European languages, such as English, French, German, and Polish, with no foreign-sounding accents.
According to Recorded Future, AI also allowed the misleading clips to be released quickly in multiple languages spoken in Europe, including English, German, French, Polish, and Turkish (incidentally, all languages supported by ElevenLabs).
Recorded Future attributed the activity to the Social Design Agency, a Russia-based organization that the U.S. government sanctioned this March for running “a network of over 60 websites that impersonated genuine news organizations in Europe, then used bogus social media accounts to amplify the misleading content of the spoofed websites.” All this was done “on behalf of the Government of the Russian Federation,” the U.S. State Department said at the time.
The overall impact of the campaign on public opinion in Europe was minimal, Recorded Future concluded.
This isn’t the first time ElevenLabs’ products have been singled out for alleged misuse. The company’s tech was behind a robocall impersonating President Joe Biden that urged voters not to go out and vote during a primary election in January 2024, a voice fraud detection company concluded, according to Bloomberg. In response, ElevenLabs said it released new safety features, such as automatically blocking the voices of politicians.
ElevenLabs bans “unauthorized, harmful, or deceptive impersonation” and says it uses a range of tools to enforce this, including both automated and human moderation.
ElevenLabs has experienced explosive growth since its founding in 2022. It recently grew ARR to $80 million from $25 million less than a year earlier, and may soon be valued at $3 billion, TechCrunch previously reported. Its investors include Andreessen Horowitz and former GitHub CEO Nat Friedman.