
A Fundamental Threat to Democracy: When the Electorate’s Agency Becomes a Target

Elections shape a country's course and confer essential legitimacy on a democratically elected government. However, the last decade has shown that elections, this crucial component of democracy, can become a period of vulnerability in which domestic and external actors leverage their resources to tangible effect. In 2026, Armenia, Bulgaria, Hungary, and Latvia will hold parliamentary elections, with general elections scheduled in Denmark and Sweden.
A malign actor can utilise three kinds of threats in its influence efforts. The first targets the conduct of elections through manipulation or sabotage of the electoral process.[1] Since the targeted country may implement protective measures, however, a threat actor can instead undermine confidence within its population, which is more easily accessible and harder to defend.[2] The second threat is therefore the erosion of trust in the conduct of elections through information manipulation; it exploits the vulnerabilities created by physical attacks on the electoral process. Finally, the electorate itself can become a target, as broad participation is essential for the general public to accept the results. The third threat thus involves targeting the will and the ability to vote in order to reduce turnout and disenfranchise certain segments of the electorate. This brief explores the third type of threat in detail.
The threat implies targeting the will and the ability to vote to reduce turnout and disenfranchise certain segments of the electorate
Two Threat Vectors
A report on Foreign Information Manipulation and Interference (FIMI) by the European External Action Service (EEAS) recognises this threat by distinguishing between voluntary and involuntary abstention from elections.[3] In this framing, the threat actor promotes voluntary abstention by targeting the will to vote—the first threat vector. It interferes by amplifying demobilising narratives that evoke emotions like hopelessness and cynicism or lead to disengagement (e.g., ‘voting changes nothing’, ‘all parties are corrupt’, etc.).[4] The EEAS specifically highlights the act of refusing to vote or invalidating one’s vote as a gesture of protest, with particular narratives promoting such behaviour as an identity or a lifestyle (e.g., ‘stay home in protest’).[5] This threat vector was visible in Italy during the 2024 European Parliament elections, when the viral hashtags #iononvoto (I don’t vote) and #iorestoacasa (I stay at home) spread on X, while several campaigns encouraged the active invalidation of votes.[6]
Via the second vector, non-participation is an involuntary reaction, as a voter’s capacity to act is impaired. This can be achieved by disinforming voters about the terms and requirements of the process or by a physical disruption. For instance, false security alerts near polling stations can spread panic and suppress turnout. The Russian Internet Research Agency (IRA) tried and tested this threat vector, among other means, during the 2016 US presidential election: it targeted the Spanish-speaking population, encouraging them to vote by SMS, a voting method that does not actually exist.[7] Such interference was facilitated by factors like the limited availability of voting material in the Spanish language and the popularity of private messaging apps in the targeted communities, enabling manipulated information to spread with speed and reach.[8] In a similar campaign ahead of Election Day in 2020, amidst the COVID-19 pandemic, automated mass calls urged more than 800 000 voters to “stay safe and stay home.”
A Complementary Strategy
In recent information operations, such as Doppelgänger or Storm-1516, a common interference method is to promote emotionally charged narratives, primarily focused on existing resentments towards minorities or cultural issues.[9] While these narratives are typically used to mobilise voters in favour of a political force, they can also serve the opposite goal in the context of a hybrid operation. In an effective interference campaign, cyber and kinetic attacks, together with information manipulation, can be used separately or in combination, building on each other to enhance reach and severity.[10] Similarly, the threats themselves can be combined to increase their effectiveness and to (dis-)engage different segments of the population.
The threats can be combined to increase their effectiveness and to (dis-)engage different segments of the population
The threat vectors could be used to microtarget the voter base of a specific party or political ideology deemed less susceptible to ‘engaging’ narratives, leading instead to their disenfranchisement from the electoral process. Such interference is enabled by the current ‘engagement-rewarding’ information environment, which allows threat actors to test, adapt, and deliver specific content at scale and with granular audience targeting, while its lifestyle-adjacent manner disguises its political nature.[11] In addition to degrading the voter base for the benefit of a desired candidate or political force, the threat vectors could also be used to weaken major parties and push towards a coalition government, complicating policy formulation and decision-making. By calibrating targeting intensity into a broader attack, overall turnout can be suppressed to the benefit of smaller, more radical parties, with destabilising effects on the political system. These two options represent only the extremes of a spectrum of actions, which, like any hybrid threat, can be tailored to each targeted society individually, aiming to capitalise on its unique combination of vulnerabilities.[12]
Although the two threat vectors (dis)engage their targets through different means, they can be considered mutually reinforcing.[13] Within this feedback loop, emotionalising narratives that evoke and amplify fear or frustration, such as ‘they’ll discard your ballot anyway’ or ‘it’s pointless to go’, also increase vulnerability to misleading procedural claims. This can be explained by the erosion of trust in political institutions, which induces the public to consult ‘alternative’, mostly unregulated information sources, where they are, in turn, more likely to encounter such claims.[14] Procedural confusion (manifesting in sentiments like ‘the system is too complicated’) can also be emotionally frustrating, leading to conclusions like ‘they don’t want my vote’.
Long-term Effects and Mitigation
Since the two threat vectors target fundamental requirements of elections and erode agency within the electorate, successful interference can have severe consequences for the targeted country.[15] Here, the EEAS identifies the risk of alienating segments of the population who refuse to recognise the election result as legitimate, which can trigger protests and violence. Such campaigns can also disrupt the trust-based connections between the state and its citizenry, demotivating interaction and strengthening anti-state sentiments.[16] This combination increases a target’s vulnerability while simultaneously complicating the state’s efforts to restore trust.
Europe must approach digital interference as an existential threat to democracy rather than a problem of platform management
When targeting elections, a threat actor can enable and expand its capabilities by advancing its foothold in the targeted society and leveraging assets in various ways, from espionage to kinetic operations such as sabotaging critical infrastructure. While the hybrid war Russia wages against Europe has been most visible in air incursions and the destruction of undersea cables, its subversive and less noticeable actions are no less impactful. To weaken resolve and disrupt decision-making, Russia undermines democratic integrity by interfering in elections and indirectly supporting political actors through FIMI campaigns that fragment the population. Europe must approach digital interference as an existential threat to democracy rather than a problem of platform management, and rigorously enforce the Digital Services Act (DSA).[17] For example, Telegram has not been designated a Very Large Online Platform (VLOP) under the DSA, resulting in less scrutiny; meanwhile, manipulated information flourishes, and the recruitment and coordination of Russian assets continue.[18]
It is essential to recognise electoral interference, and even attempts at it, as part of a broader threat landscape, and to adapt responses to its increased scale and severity. A holistic approach to hybrid threats must aim at overcoming the silos that separate perceptions of FIMI, cyberattacks, hybrid operations, and kinetic actions.[19] This idea is grounded in the understanding that the EU’s adversaries possess an operational advantage, as they can adapt their cross-domain threats to exploit Europe’s blind spots.
In the case of the two threat vectors, the provision of multilingual information channels for voters can pre-emptively reduce the effectiveness of targeting the ability to vote.[20] Whereas reliance on the informational domain gives a threat actor the advantage of low cost and high scalability, it can also prove to be a major weakness. With the feedback loop in mind, measures to address information manipulation can be adapted to counter malign narratives and the emotions they elicit. For example, the prebunking technique, a newer tool in the fight against FIMI, introduces campaign-specific, truth-based information to the targeted population, increasing resilience against manipulation.[21] To utilise this ‘priming effect’, it is critical to identify a FIMI campaign in its initial phase, before it gains traction, which requires continuous improvement of early warning systems and monitoring capabilities, as well as intelligence sharing.[22] Prebunking saves resources at a later stage that would otherwise be needed for debunking and fact-checking. As has become evident in recent years, the resources for such efforts are overstretched by the sheer volume of information manipulation, especially during high-stakes periods such as elections.
To strengthen media literacy and resilience in the population, the EU developed the educational game Vote for Turtle.[23] Building on a prebunking-like technique, it allows users to experience an information manipulation campaign from the threat actor’s perspective in a simplified, short, and humorous format, reinforcing key aspects through repeated exposure. This practical, interactive experience familiarises users with the processes and warning signs. Educational tools like this could be introduced as early as school level, giving the future electorate the best possible conditions to learn to search for and access reliable information, an ability now more crucial than ever.
[1] Sebastian Bay, Countering Hybrid Threats to Elections: From Updating Legislation to Establishing Collaboration Networks, Hybrid CoE Research Report no. 12 (The European Centre of Excellence for Countering Hybrid Threats, 2024).
[2] Sebastian Bay et al., Hot mot svenska allmänna val. Exempel och scenarier för valadministrationen [Threats to Swedish general elections. Examples and scenarios for election administration], FOI-R--5298--SE (Swedish Defence Research Agency, 2022).
[3] European External Action Service, 2nd EEAS Report on Foreign Information Manipulation and Interference Threats: A Framework for Networked Defence (European External Action Service, 2024).
[4] Sebastian Bay, email to author, 7 November 2025.
[5] Bay, email to author.
[6] Foreign Information Manipulation and Interference – Information Sharing and Analysis Center, FIMI-ISAC Collective Findings I: Elections (FIMI-ISAC, 2024); Cynthia Kroet and Romane Armangau, “Who Wants to Stop People Voting in European Elections, and Why?,” Euronews, 6 June 2024.
[7] Bay, email to author.
[8] William T. Adler and Dhanaraj Thakur, A Lie Can Travel: Election Disinformation in the United States, Brazil, and France (Centre for Democracy & Technology; Konrad-Adenauer Foundation, 2021).
[9] For detailed information about the campaigns, see: Bavarian Office for the Protection of the Constitution, „Doppelgänger“ Interne Details zu Russischer Desinformationskampagne. Teil 2 – Vollanalyse [“Doppelgänger” Internal Details of Russian Disinformation Campaign. Part 2 – Full Analysis], (Bavarian Office for the Protection of the Constitution, 2024); James Pamment and Darejan Tsurtsumia, Beyond Operation Doppelgänger: A Capability Assessment of the Social Design Agency (Psychological Defence Research Institute at Lund University, 2025); VIGINUM, Analysis of the Russian Information Manipulation Set Storm-1516 (VIGINUM, 2025).
[10] Bay, Countering Hybrid Threats to Elections.
[11] Bay, email to author.
[12] Mikael Wigell, “Hybrid Interference as a Wedge Strategy: A Theory of External Interference in Liberal Democracy,” International Affairs 95, no. 2 (2019): 255–75.
[13] Bay, email to author.
[14] W. Lance Bennett and Steven Livingston, “The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions,” European Journal of Communication 33, no. 2 (2018): 122–39.
[15] Bay et al., Hot mot svenska allmänna val; Bay, email to author.
[16] Eri Bertsou, “Rethinking Political Distrust,” European Political Science Review 11, no. 2 (2019): 213–30.
[17] Vassilis Ntousas and Etienne Soula, Europe’s Moment of Truth. A Democracy Shield for Today and Tomorrow (Alliance for Securing Democracy at the German Marshall Fund of the United States, 2025).
[18] Kacper Rekawek et al., Russia’s Crime-Terror Nexus. Criminality as a Tool of Hybrid Warfare in Europe, ed. Dominika Hajdu (GLOBSEC, International Centre for Counter-Terrorism, 2025).
[19] Ntousas and Soula, Europe’s Moment of Truth.
[20] Bay, email to author.
[21] Roman Shutov, Prebunking in Practice, Presentation at the Disinfo2025 conference (International Media Support, EU Disinfo Lab, 2025).
[22] Ntousas and Soula, Europe’s Moment of Truth.
[23] European Union, “Vote for Turtle,” accessed February 2026.
