Voters in Germany are facing an unprecedented wave of far-right narratives and disinformation ahead of the crucial election that will determine the composition of the new Bundestag. As the country prepares to go to the polls, experts warn that Russian-linked campaigns, including groups known as “Doppelganger” and “Storm-1516,” are leveraging artificial intelligence to spread misleading content. These campaigns have been linked to attempts to influence public opinion, with tactics ranging from creating fake TV news stories to producing deepfake videos that fabricate sensational claims about prominent politicians. For instance, in November 2024, a video surfaced falsely accusing a member of parliament and outspoken supporter of Ukraine of being a Russian spy; the video used AI to impersonate a former adviser making the claim. The targeted politician, Dr. Marcus Faber, was unavailable for comment. Similarly, another AI-generated video falsely accused a German minister of child abuse, further highlighting the alarming scope of these operations.
These disinformation efforts are part of a broader strategy to shape Germany’s political landscape. Researchers from the Center for Monitoring, Analysis and Strategy (CeMAS) and Alliance 4 Europe, organizations specializing in combating disinformation, have identified patterns in these campaigns. The Doppelganger campaign, run by a Russian PR company with alleged Kremlin ties, specializes in creating fake news articles that mimic reputable publications. These stories are then amplified by networks of social media accounts, often through posts framed as testimonials from concerned citizens. For example, one post expressed worry that aid to Ukraine would divert resources from Germany’s infrastructure and social security, linking to a fake article criticizing the government’s support for Ukraine. This systematic flooding of false narratives aims to discredit mainstream parties while boosting the far-right Alternative for Germany (AfD).
The AfD, currently second in opinion polls, has been notably active on social media during the campaign. While the party has not explicitly endorsed the Russian disinformation campaigns, experts point to disturbing overlaps. In one instance, an AfD parliamentary member, Stephan Protschka, shared a baseless claim that the Green Party was collaborating with Ukraine to commit crimes and frame the AfD. This narrative, researchers later discovered, originated from a Russian disinformation campaign. Protschka did not respond to requests for comment, and the Russian groups behind the campaigns also remained unresponsive when approached by journalists. Ferdinand Gehringer, a cybersecurity policy adviser, notes that Russia’s interference is no surprise, given its strategic interests in weakening European unity and fostering cooperation with far-right factions like the AfD.
Beyond Russian-led disinformation, homegrown far-right groups in Germany are also exploiting advanced technologies to spread their ideologies. A striking example is Larissa Wagner, an AI-generated social media influencer who has gained attention for her far-right views. Her accounts, created within the past year, feature regular videos urging Syrian immigrants to “pack your bags and go back home” and expressing support for the banned right-wing magazine Compact. When asked who controls her accounts, Wagner dismissed the question, framing herself as a youthful, influential voice in the political discourse. Experts such as Ferdinand Gehringer note that her content has grown increasingly radical and that the persona’s appeal as a young, attractive woman is used to engage audiences more effectively.
The far-right’s use of generative AI extends beyond characters like Larissa Wagner. A recent report by the Institute for Strategic Dialogue found that since April 2023, more than 880 far-right posts have used AI-generated images, memes, and music videos. A significant portion of this content originates from AfD supporters and even the party’s official accounts; in October alone, the AfD published more than 50 posts containing AI-generated material. Analysts describe the party as the most aggressive exploiter of this technology, using it to push a dual narrative: portraying migrants as violent criminals while glorifying traditional German values. For instance, the now-disbanded youth wing of the AfD commissioned an AI-produced song and music video advocating the mass deportation of immigrants, a concept known as “remigration.”
The consequences of these disinformation campaigns are deeply concerning. Cathleen Berger, a senior researcher at the Bertelsmann Foundation, emphasizes that foreign disinformation alone may not shift public opinion but becomes dangerous when domestic actors amplify it. A recent survey by the foundation found that 80% of Germans view online disinformation as a major societal problem, and 88% believe it is used to influence political opinions. As the election approaches, experts warn that the combination of foreign interference and homegrown far-right content could have a profound impact on the outcome. The situation underscores the urgent need for transparency and accountability in digital spaces, as well as stronger measures to combat the manipulation of public discourse.