The past decade has seen a growing preoccupation, especially in Europe and North America, with the potential of disinformation as a tool in the hands of adversarial states. Spreading false or deceptive messages is, of course, nothing new. From Sun Tzu to Machiavelli, many great theorists of strategy have advocated their ‘weaponisation’. Throughout history, numerous states, from liberal democracies to totalitarian dictatorships, have followed their advice, deliberately spreading lies as a means to achieve their strategic goals.
What has changed, however, is the media. In today’s extremely fast, engaging, and mostly digital media environment, disinformation flows rapidly and is extremely difficult to control. Democratic states increasingly recognise that their open communication regimes can be exploited by adversaries to sow discord and stimulate division. From the simplest and most direct falsehoods to elaborate and convoluted conspiracy theories, disinformation appears today as one of the principal threats to democracies’ information spaces.
The Kremlin has traditionally been keen on using information as a weapon. During the Cold War, the Soviet Union carried out far-reaching ‘Active Measures’, involving the spread of disinformation and conspiracy-type accusations, hoping to weaken its Western adversaries. Under Putin, Russia remains one of the main users of disinformation and conspiratorial narratives, targeted mostly (but not solely) at Western audiences. In 2016, the Brexit referendum and the US presidential campaign first brought widespread attention to Russia’s tactics. More recently, the Covid pandemic has demonstrated how other authoritarian powers, like China, are learning from Russia’s playbook. Analysing the uses of disinformation during the ongoing invasion of Ukraine can highlight many important elements of this practice, potentially suggesting ways to combat it.
The Russian Disinformation Machine Part I: Official Broadcasters
Within the Russian context, the State employs instruments at two different levels to spread disinformation. The first is formal, or otherwise official.
Since the early 2000s, the Putin-led Kremlin has invested heavily in state media agencies capable of transmitting its messages worldwide. These have been systematically placed under the administration of the federal executive agency Roskomnadzor. In many cases, Russia could count on pre-existing ‘infrastructure’. TASS, the country’s leading state news agency, was established in the early 20th century and later repurposed by the Soviets, who first expanded its reach beyond the USSR’s borders. Having seen a significant inflow of investment under Putin, TASS is today one of the four largest news agencies globally. By contrast, Russia Today and Sputnik were established in their current form at the request of Putin and communications minister Mikhail Lesin. Again, the Kremlin has dedicated increasingly large resources to these initiatives: in 2020, RT’s budget exceeded €300 million.
Before the invasion of Ukraine, the reach of Russia’s ‘Instruments of Mass Information’ (SMIs) – as the broadcasters are defined – had progressively grown to encompass all five continents. Russia Today broadcasts in more than 100 countries in five languages and, before March 2022, was among the most followed news pages on most major social networks. Likewise, Sputnik News has regional offices spanning most of the world and runs a 24/7 news reporting service. The recent invasion has led European states to block many Russian SMIs from most online platforms and revoke their local broadcasting licences. Nevertheless, Russian media still maintains a strong reach in other regions of the world, especially the Global South, and even in Europe, where broadcasts are often informally re-shared on social media platforms by local supporters.
Question More: Narrative Building at Russia Today
Unlike other states, Russia (mostly) does not employ its media apparatus for self-glorification abroad. While it is true that broadcasters often seek to highlight Putin’s personality as a strong leader and the defender of a worldwide ‘conservative’ front, this messaging is often of secondary importance. Rather, channels like RT focus on highly engaging and divisive material, with the goal of creating strife within target societies. By highlighting polarising stories, such as political scandals, protests, and terrorist attacks, Russia’s official media apparatus aims to convince foreign audiences of the inherently ‘unstable’ nature of liberal democracy. An editor for RT notoriously summarised the channel’s editorial line as ‘anything that causes chaos’.
Moreover, Russian official media also engages directly in disinformation, albeit with a degree of caution. Seeking to retain a veneer of ‘professionalism’, it (largely) avoids directly endorsing false information, instead platforming individuals who espouse conspiratorial beliefs. In general, RT seeks to portray itself as a super partes agent with no set agenda, merely allowing viewers to ‘decide for themselves’ after hearing ‘all sides of the story’. ‘Question More’, the broadcaster’s tagline, precisely captures this commitment to absolute relativism in its reporting. The conspiratorial narratives and extremist groups platformed by RT are varied, coming from both the right and the left: from 9/11 ‘truthers’, to US-based neo-Nazis, to, more recently, anti-vax and anti-globalisation activists.
Finally, Russia deploys its SMIs in a highly sophisticated manner. Whereas the foreign official broadcasters of other states, such as China, are expected to ‘repeat’ the Party’s line to audiences abroad, Russia’s retain a higher degree of independence. Claiming to act as ‘alternative media’, Russia’s SMIs are free to tailor their messaging to the political and socio-cultural background of target states, making for more effective narrative-building efforts.
The Russian Disinformation Machine Part II: Informal and Stochastic Means
In parallel to this network of formally recognised broadcasters, Russia also employs complex networks of ‘underground’ actors. Russia’s asymmetrical warfare capabilities in cyberspace have developed in parallel to those of the other major powers. In terms of information operations, however, Russia has been a de facto pioneer.
Perhaps the most discussed elements of Russia’s underground online propaganda machine are the so-called ‘troll farms’ and ‘bot networks’. The former term refers to a system through which coordinated groups of individuals control one (or, more often, multiple) social media accounts and actively spread disinformation, or engage with foreign users in an attempt to ‘poison the well’ of political discourse. Troll accounts seek to engage in ‘real’ conversations with foreign members of the public; due to the anonymity afforded to them by online platforms, they can thus relatively easily pass as ‘concerned citizens’.
The mass deployment of troll accounts has often been carried out through so-called ‘farms’: state-funded groups of operators that administer thousands of accounts at once. The term ‘bot’, conversely, refers to automated accounts, which often act as ‘amplifiers’ for propaganda messages and need little human supervision. Bots are less suited to direct conversation (although AI-related improvements are making computer-led interactions increasingly ‘human’), but they are cheaper and faster.
In general, Russia seeks to deploy these tools for ‘trend hijacking’: repurposing pre-existing trends for strategic aims. In terms of narrative-building, this essentially means that Russia does not seek to create disinformation or conspiracy claims from scratch. Cultural and linguistic differences may make such stories less likely to ‘stick’. Rather, pre-existing, nation-specific news stories are amplified exponentially, often paired with ad hoc conspiratorial or fake elements. In the case of conspiracy theories, this approach allows the Kremlin to hijack conspiratorial narratives and potentially repurpose their ‘believers’ as an anti-government and anti-democracy ‘fifth column’.
Setting the Narrative during the ‘Special Military Operation’ in Ukraine
The invasion of Ukraine has seen Russia mobilise this vast network of official and unofficial actors. Since the aggression’s early hours, Russian-affiliated social media accounts began spreading disinformation, both to overwhelm adversaries’ information spaces and to control the early narratives associated with the conflict. The scope and character of these efforts differ from earlier ‘campaigns’, as Russia’s strategic position is today completely different. During the Covid pandemic or foreign electoral campaigns, Russian messaging could engage target audiences ‘from the flank’. Remaining outside the spotlight, Moscow could gradually seek to influence foreign discourses while avoiding an excessively confrontational posture.
The invasion has essentially removed the rationale behind this behaviour. Internationally recognised as the aggressor, Russia has adopted highly aggressive and direct communications with foreign publics. In the early stages of the invasion, Russian disinformation took a form that essentially mirrored the talking points espoused by the leadership. Echoing Putin, RT and other channels portrayed the ‘special military operation’ as a defensive effort meant to ‘de-nazify’ Ukraine and pre-empt the ‘Anglo-Saxon powers’ from starting a global war. Gradually, these basic narratives have been bolstered by full-fledged conspiratorial ones, often specifically targeting European and American audiences. As research by EUvsDisinfo shows, Russia has sought to connect misleading narratives about Ukraine to well-established conspiracies – also with the goal of weaponising pre-existing groups of ‘believers’.
For instance, one ongoing narrative argues that the Russian invasion prevented a biological attack originating from secret US-sponsored biolabs in Ukraine. Although some of its elements are new, this theory deliberately echoes narratives developed during the Covid-19 pandemic, as well as those espoused by QAnon. It thus seeks to redeploy these domestic groups (QAnon and anti-vax communities) as useful fifth columns to deepen social divisions within liberal democracies.
An Italian “Z-Network” on Twitter
While conducting research on Twitter for this article, I observed a loosely connected ‘Z-network’ of accounts spreading invasion-related conspiracy theories about the Italian government. With this (admittedly unoriginal) term, I refer to a decentralised network of users behaving according to the ‘bot-retweeter’ model while also actively engaging in invasion-related discourses. Mostly posting in Italian, these accounts share similarities in aesthetics (including usernames featuring ‘Z-’ prefixes or Russian flags) and language (shared terminology and hashtags). Although attribution is uncertain, their behaviour exhibits many key narrative-building elements.
In a first Italian-language exchange, dated March 25th, user ‘Ivanka’ (@Ivanka2424) tweeted ‘Kiev doesn’t lack electricity, supermarket shelves are full […]. What a strange war’. ‘Katjûsa’ (@unonessuno) responded ‘While here in Italy there is no, [sic] canned food, flour, seed oil’. Despite its obvious farcicality, this exchange echoes a broader discourse of scarcity, which recently developed in Italian politics after increases in electricity bills and fuel prices. It also fits within the broader ‘Great Reset’ conspiracy, which portrays a ‘Globalist elite’ as actively seeking to impoverish Western citizens to render them dependent on, and thus enslaved to, their governments.
A later (May 5th) post from user ‘Flaminia’ (@flayawa) follows the same framework. It portrays a Ukraine-sceptic EU and absurdly claims that ‘Germany expel[led] the Ukrainian ambassador’, only to ask rhetorically, ‘guess what country is governed by traitors?’. User ‘Roberta’ (@Robertavin_73) replies ‘Draghistan! [sic]’. The suffix ‘-istan’ is popular among groups promoting the ‘Great Replacement’ conspiracy, according to which EU elites promote migration to create a more fragmented (and thus controllable) European population. Here, therefore, claims of government corruption are again linked to an apparently unrelated conspiracy. Linking different conspiratorial beliefs into a single ‘big picture’ increases the ‘appeal’ of pro-Moscow propaganda by relating its narratives to more ‘established’ ones.
If the invasion of Ukraine has represented a ‘new chapter’ in Russia’s use of disinformation, it has also changed European countries’ preferences in dealing with it. Prior to the invasion, the preferred counter-disinformation strategies fell into three categories. First, the most basic set of measures involved attempting to mitigate the impact of disinformation by disproving it. ‘Fact-checking’ and ‘debunking’, often conducted through civil society partners, are meant to ‘protect’ the public from disinformation at the point of consumption. Second, ‘educative’ measures – such as digital literacy courses – have recently become increasingly popular. These hope to make citizens less likely to believe disinformation before they consume it. Lastly, the most drastic set of measures involves interdicting the information space to hostile foreign actors. Democratic states have generally refrained from implementing such measures at a widespread scale, partly due to ethical concerns.
The invasion has essentially removed this barrier. The weeks following it have seen a sweeping implementation of restrictions targeting the producers and amplifiers of disinformation. In early March, the Council of the EU suspended the broadcasting activities of all channels within the Russia Today/Sputnik framework. Social media and tech companies including Google, Facebook, and Twitter immediately ‘geo-blocked’ RT/Sputnik in Europe. This step is particularly relevant, as the removal of state-affiliated broadcasters seems to challenge previous European commitments to openness.
A recent European Parliament Resolution (dated March 9th) suggests that these measures represent a significant shift in the way the EU conceptualises media and security. The document states that ‘freedom of expression’ must be limited not only when it involves ‘harassment, hate speech, racial discrimination’, but also ‘terrorism, violence, espionage and threats’, thus reiterating the primacy of national security within online information spaces.
Possible Risks and Implications
The EU’s response to Russia’s attack on Ukraine may signal an increased willingness to use drastic measures to prevent adversaries from spreading disinformation. The March EU Resolution offers a framework through which further implementation of these measures may be conducted. Still, this shift may also have a negative impact on EU states. While removing troll accounts and bots from a social media platform is hardly controversial, targeting official media actors raises more issues. From a civil rights perspective, extreme care is warranted in blocking broadcasters, even those linked to foreign powers.
Moreover, while the banning of RT and other channels will undoubtedly help to reduce the number of EU citizens directly subjected to disinformation, it is unlikely to truly address the ‘roots’ of the problem. Hardline supporters and conspiracy theory ‘believers’ are likely to employ unmonitored channels to obtain foreign-produced material, which will thus continue to be spread on social media. Adopting harsh measures to censor private citizens – even those involved in spreading fake material – would obviously constitute a violation of the EU’s values, and would severely damage European democracies’ claims to openness and freedom.
Consequently, it appears that the European Union may be hurriedly moving in a dangerous direction. It is of course true that the very nature of open digital spaces puts would-be moderators and supervisors at an inherent disadvantage. However, it is also clear that no single remedy to the spread of disinformation exists. Rather, multi-level approaches should be implemented holistically.
European Parliament. Resolution on Foreign Interference in All Democratic Processes in the European Union, Including Disinformation, 9 March 2022, P9_TA(2022)0064, art. 29.
EUvsDisinfo (2022). Disinfo Database: Disinformation Cases Against Ukraine. [online] Available at: https://euvsdisinfo.eu/disinformation-cases/?date=&per_page=
Lucas, E. Firming Up Democracy’s Soft Underbelly: Authoritarian Influence and Media Vulnerability. Washington DC: International Forum for Democratic Studies, 2020, p. 8.
Silverman, C. and J. Kao (2022). ‘Infamous Russian Troll Farm Appears to Be Source of Anti-Ukraine Propaganda’, ProPublica, 11 March. [online] Available at: https://www.propublica.org/article/infamous-russian-troll-farm-appears-to-be-source-of-anti-ukraine-propaganda
The (untranslated) exchanges are available at: @Ivanka2424, ‘Che strana guerra’, Twitter, March 25 2022. https://twitter.com/Ivanka2424/status/1507272237904277506; and at: @flayawa, ‘La Germania espelle l’ambasciatore Ucraino’, Twitter, May 5 2022. https://mobile.twitter.com/flayawa/status/1522121908367642628?cxt=HHwWiMC-0fTz1J8qAAAA
Walker, C. and J. Ludwig. A Full-Spectrum Response to Sharp Power: The Vulnerabilities and Strengths of Open Societies. Washington DC: International Forum for Democratic Studies, 2021.
Yablokov, I. and P. N. Chatterje-Doody. Russia Today and Conspiracy Theories: People, Power, and Politics on RT. Abingdon/New York: Routledge, 2022.
Author of the article*: Manfredi Pozzoli, BA in History & International Relations at King’s College London, Master’s student in Diplomacy & International Governance at Sciences Po and the London School of Economics.
Editor’s Note – Think Tank Trinità dei Monti
As always, we publish our articles to encourage debate and to share knowledge and original, alternative points of view.
* The contents and opinions of this article are the sole responsibility of the author(s).