Researchers: Wikipedia a front in Russian propaganda war

New research examining pro-Kremlin edits made to the English-language page for the Russo-Ukrainian war has shed light on how Wikipedia can be manipulated for information warfare.

The analysis, from the Institute for Strategic Dialogue (ISD) and the Centre for the Analysis of Social Media (CASM) in the U.K., could help develop systems that can detect platform manipulation on one of the world’s most popular websites.

Published on Monday, the study detailed suspicious edits to the Russo-Ukrainian war page on the digital encyclopedia, alongside what it describes as a methodology for identifying potentially coordinated efforts to manipulate Wikipedia more generally.

Carl Miller, one of the paper’s authors, stressed that the research wasn’t a smoking gun for state-linked manipulation. “We haven’t attributed suspicious editing activity directly to the Russian state. We were never going to be able to do that,” he said. “The idea was to try to characterize behavior already known to be suspicious to see whether — in time — we can use this as signals for undetected suspicious activity.”

The ISD/CASM analysis focused on 86 accounts that had edited the Russo-Ukrainian war page and were subsequently banned by Wikipedia for violating its rules, including by operating “sock-puppet” accounts used to disguise their real operators.

Although the research “did not set out to identify unknown suspicious activity, but rather to understand and describe activity that was already known to be suspicious,” the manipulation of Wikipedia it details has not been widely reported and suggests the platform could be exploited in much the same way as social media platforms.

The editors the analysis focused on were behind 681 edits to the page. Miller said the researchers identified 16 links to state-sponsored media introduced by those edits, but acknowledged this was “probably not the best way of identifying suspicious edits.” The researchers therefore went on to analyze each of the edits manually.
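
As a rough illustration of what such link-based screening could look like (this is a sketch, not the researchers’ actual tooling, and the watchlist of domains is a hypothetical stand-in), Wikipedia’s public MediaWiki API makes it possible to walk a page’s revision history and flag edits that introduce links matching a list of state-sponsored outlets:

import re
import requests

API = "https://en.wikipedia.org/w/api.php"
# Assumed example domains for illustration only; not taken from the study.
STATE_MEDIA = {"rt.com", "sputniknews.com"}

def external_domains(wikitext):
    """Extract bare domains from external links in the page's wikitext."""
    return {m.group(1).lower()
            for m in re.finditer(r"https?://(?:www\.)?([^/\s\]|}]+)", wikitext)}

def fetch_revisions(title, limit=50):
    """Pull recent revisions of a page, newest first, with full wikitext."""
    params = {
        "action": "query", "format": "json", "prop": "revisions",
        "titles": title, "rvlimit": limit,
        "rvslots": "main", "rvprop": "ids|user|content",
    }
    pages = requests.get(API, params=params).json()["query"]["pages"]
    return next(iter(pages.values())).get("revisions", [])

revisions = fetch_revisions("Russo-Ukrainian War")
revisions.reverse()  # oldest first, so "introduced" means new relative to the parent
previous = None
for rev in revisions:
    domains = external_domains(rev["slots"]["main"]["*"])
    if previous is not None:  # the oldest revision in the window only seeds the baseline
        flagged = (domains - previous) & STATE_MEDIA
        if flagged:
            print(rev["revid"], rev.get("user", "?"), "added:", sorted(flagged))
    previous = domains

As Miller’s caveat suggests, link-matching of this kind misses edits that shift framing without citing state media at all, which is why the manual pass mattered.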

The manual analysis found the edits exhibited “narratives consistent with Kremlin-sponsored information warfare,” casting doubt on the objectivity of pro-Western narratives while playing up the objectivity of pro-Kremlin accounts of events.

The edited material also supported Kremlin descriptions of ongoing events, including by introducing historical narratives about the ousting of former Ukrainian president Viktor Yanukovych — an incident preceded by his refusal to sign an association agreement with the European Union and shortly followed by Russia’s military annexation of Crimea.

Other edits added “Kremlin quotations and press releases explicitly into the page to increase the salience of pro-Russian arguments and viewpoints,” the authors wrote.

The researchers said their aim was to examine the ways in which Wikipedia could be vulnerable to the same kinds of manipulation that have targeted social media platforms such as Facebook, Twitter, YouTube and Reddit.

“Wikipedia has been famously resilient to vandalism. All edits are open, vandalism can be rolled back quickly, pages can be locked and protected, and the site is patrolled by a combination of bots and editors,” the study states.
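
That openness is also what makes outside scrutiny possible: the full edit history of any page, with authors, timestamps and edit summaries, is publicly queryable. A minimal sketch using the same MediaWiki API (the page title and fields shown are illustrative):

import requests

API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query", "format": "json", "prop": "revisions",
    "titles": "Russo-Ukrainian War", "rvlimit": 10,
    "rvprop": "ids|timestamp|user|comment",
}
pages = requests.get(API, params=params).json()["query"]["pages"]
for rev in next(iter(pages.values()))["revisions"]:
    # Every edit is logged with who made it, when, and the edit summary.
    print(rev["timestamp"], rev.get("user", "?"), "-", rev.get("comment", ""))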

Describing itself as “a short contribution to the discussions around the threats posed by information warfare,” the paper says one question remains: “How vulnerable is Wikipedia to information warfare which might use subtler methodologies and be executed over longer lengths of time?”

The U.S. government has accused Russian organizations of engaging in “information warfare” by operating social media campaigns designed to inflame division and muddy public understanding of news events. However, the declassified Intelligence Community Assessment of that interference during the 2016 presidential election did not identify any such activity on Wikipedia.

For the general public, Wikipedia competes with the most respected news organizations in the world as a trusted source of information, despite being constructed by volunteers who are required to link to source material — often provided by those same news organizations.

It is regularly targeted by reputation management companies and other malicious editors seeking to promote particular views. In September 2021, the Wikimedia Foundation took action against 19 users, seven of whom were banned; in an announcement, the organization linked the action to concerns about “infiltration” from mainland China.

Alexander Martin is the UK Editor for Recorded Future News. He was previously a technology reporter for Sky News and is also a fellow at the European Cyber Conflict Research Initiative.