The European Parliament building in Strasbourg, France. Image: European Parliament / Flickr

EU sets rules for Big Tech to tackle interference in European Parliament elections

The European Commission has published a range of new rules for the largest technology platforms to abide by, ahead of the European Parliament elections in June.

The guidelines under the Digital Services Act only apply to the very largest platforms and search engines — those with more than 45 million active users in the bloc — and come with fines of up to 6% of the company’s global turnover if the platform is judged to be flouting them.

Officials in the European Union are concerned that, with a war in Europe, the elections will be targeted by Russia in support of its own policy objectives, particularly undermining support for Ukraine.

The elections will see voters in EU member states elect a total of 720 Members of the European Parliament (MEPs), although the Parliament holds considerably less power than national legislatures, as it cannot formally initiate legislation. The European Commission — composed of officials nominated by each member state — effectively forms the bloc’s executive.

European elections themselves tend not to attract the same levels of attention as domestic ballots. Turnout in the 2019 European Parliament election was the highest for more than 20 years, at just over 50% — but still significantly below the 73% for the French presidential election in 2022, or the 76% in the German federal election in 2021.

Despite this, the integrity of the vote is a major issue amid concerns about growing far-right nationalism in Europe, which has been linked to Russian support, both through direct funding and through hack-and-leak operations.

Some analysts have predicted a significant shift to the right in many countries during this year’s election, with populist and Eurosceptic parties gaining votes and seats across the EU amid concerns over high inflation rates and the increasing cost of living.

To ensure that such a shift is the authentic will of the people, rather than the product of foreign information manipulation and interference, the European Union’s new guidelines oblige the largest technology platforms to set up “internal teams with adequate resources” to monitor for interference risks “before, during and after elections.”

Transparency for users

The companies will also have to maintain a publicly available, searchable repository of political advertisements “updated in as close as possible to real-time” that will allow third parties to identify who has been targeted by which content.

According to the guidelines, the platforms will be required to “promote official information on electoral processes […] and adapt their recommender systems to empower users and reduce the monetisation and virality of content that threatens the integrity of electoral processes.”

Generative AI is specifically cited by the European Commission, with the platforms asked to take heed of the bloc’s recent AI Act and to attempt to inform users when they are engaging with AI-generated content.

One of the major features of the bloc’s recently approved AI Act is a provision encouraging the developers of such software to watermark its output, allowing the very large platforms covered by the Digital Services Act to notify users when they’re engaging with a fake.

Deceptive media is regularly used to smear politicians. Late last year, an audio clip posted to social media, purporting to show Britain’s opposition leader Keir Starmer verbally abusing his staff, was debunked as AI-generated by private-sector and British government analysis.

Similar material was used to try to sway elections in Slovakia, where, two days before polls opened on September 30, faked audio clips published on social media sought to implicate an opposition party leader and a journalist in a plot to rig the election by buying votes.

Publicly debunking the audio was a challenge because of the country’s election laws, which strictly prohibit both the media and politicians from making campaign announcements in the two days before polls open.

As reported by Wired, as an audio post the fake also “exploited a loophole in Meta’s manipulated-media policy, which dictates only faked videos — where a person has been edited to say words they never said — go against its rules.”

It is not clear who produced the fake audio in either the Slovakian or British cases. Widespread access to generative AI software makes it trivial for non-state actors and trolls to create false and viral content.

Margrethe Vestager, the EU’s executive vice-president for a Europe Fit for the Digital Age, said: “We adopted the Digital Services Act to make sure technologies serve people, and the societies that we live in. Ahead of crucial European elections, this includes obligations for platforms to protect users from risks related to electoral processes – like manipulation, or disinformation. Today’s guidelines provide concrete recommendations for platforms to put this obligation into practice.”


Alexander Martin is the UK Editor for Recorded Future News. He was previously a technology reporter for Sky News and is also a fellow at the European Cyber Conflict Research Initiative.