ACT Response to the EC consultation on the Election Integrity Guidelines

Brussels, 6 March 2024. The Association of Commercial Television and Video on Demand Services in Europe (ACT) welcomes and commends the European Commission (EC) for its work on guidelines on the integrity of electoral processes. We hope they will play an important role in safeguarding electoral processes across Europe in the coming years.

This response outlines the ACT’s views on the draft guidelines and makes a number of suggestions to improve them.

THE PIVOTAL ROLE OF REGULATED MEDIA

Broadcasters (TV and radio) remain the two most trusted sources of information, in stark contrast with social media (26%)[1]. They therefore play an essential role in the fight against disinformation and in ensuring the integrity of elections.

Commercial broadcasters have robust editorial processes in place to ensure that fact-checked and editorially responsible information is accessible to viewers. While the guidelines refer to the role of journalists and media in gathering, processing, and reporting fact-checked information to the public (point 24), the contribution of broadcasters and their journalists to the integrity of electoral processes does not receive the attention it deserves. For instance:

  • Fact-checking: Ultimately, the first – and best – fact-checkers are journalists and other editorial staff. While fact-checking organisations may be commendable and their role can be important, they are not substitutes for professional journalists. Furthermore, we are wary that in some situations they are not sufficiently independent from the platforms they monitor, as in many cases they are directly or indirectly funded by those platforms. Encouraging platform cooperation with local media organisations may achieve more tangible results than an overreliance on fact-checking organisations and EDMO hubs.
  • Authoritative content: While the guidelines suggest seeking authoritative content from election authorities, they fail to highlight the crucial role of trustworthy – and regulated – local media, which provide contextual insights on electoral processes that are usually missed online.

To sum up, the guidelines should put more emphasis on the important role of regulated media and their journalists in ensuring a fact-based public discourse and in giving citizens access to comprehensive, accurate, and locally relevant information on electoral processes.

RISKS RELATED TO DEEPFAKES

Some of our members have alerted us to the difficulties they face in handling the overwhelming presence of deepfakes on social media. Deepfakes and their identification have made fact-checking more challenging, putting a burden on our editorial processes. To date, platform actions to monitor, detect, and remove harmful AI-generated content have fallen short[2]. We have taken note of recent voluntary commitments by a number of actors[3]. Any action in this space – particularly with regard to the development and use of watermarking standards for deepfakes – is welcome, but previous voluntary platform initiatives, notably around disinformation, do not inspire confidence that this initiative will bear fruit.

We therefore welcome the AI-related measures outlined in chapter 3.3 of the draft guidelines. They could, however, be strengthened in the following ways:

  • Access to data: Journalists, regulators, and researchers should be granted greater access to anonymised social media data (e.g. via protocols or APIs).
  • Detection and watermarking technologies: Providers of software that produces deepfakes should invest in the development of new disinformation detection technologies to counterbalance the spread of harmful deepfakes.
  • Regular exchanges: Providers of VLOPs and VLOSEs should establish contact points and regular exchanges with media service providers, irrespective of the latter’s status as trusted flaggers (Art. 22 DSA), building on Recital 61 DSA and Art. 17 EMFA.
  • Demonetisation: In addition to content identified as disinformation, content identified as a deepfake could be demonetised by default.

INTERSECTION WITH EXISTING EU RULES

While the guidelines mention the Regulation on political advertising and the AI Act, they overlook other relevant legislation, such as the European Media Freedom Act (EMFA), the Unfair Commercial Practices Directive, and the Audiovisual Media Services Directive (AVMSD).

As the guidelines highlight the possibility that risk mitigation measures may impact media freedom and pluralism (point 21), they should refer to Articles 17 (protection of media content) and 18 (dialogue between media and platforms) of the EMFA. These articles are specifically designed to safeguard media freedom and pluralism online, both of which are essential to the integrity of electoral processes and to tackling political disinformation.

We appreciate the acknowledgment that influencers have an impact on electoral choices. However, the overreliance on self-declaration is concerning given influencers’ non-compliance with existing transparency rules on commercial communications[4]. Stronger language mandating that platforms ensure their tools (e.g. those foreseen in the AVMSD) are consistently used in practice would be helpful.

THE LIMITS OF SELF-REGULATION

We welcome the EC’s work to move away from a purely self-regulatory model to a co-regulatory approach. We are conscious that attempts to tackle disinformation via self-regulation, such as the EU Code of Practice on Disinformation (CoP), have been largely ineffective[5]. The withdrawal of Twitter from the CoP underlines the limits of such an approach.

Time will tell whether the co-regulatory approach paved by the guidelines will bear fruit. Should this not be the case, we would encourage the EC to proceed swiftly with formal investigations and sanctions.


[1] Flash Eurobarometer 464, Fake News and Disinformation Online.

[2] EDMO White Paper on AI and disinformation.

[3] https://www.aielectionsaccord.com/uploads/2024/02/A-Tech-Accord-to-Combat-Deceptive-Use-of-AI-in-2024-Elections.FINAL_.pdf

[4] https://ec.europa.eu/commission/presscorner/detail/en/ip_24_708

[5] EC report on the application of the DSA’s risk management framework to Russian disinformation campaigns: platforms “failed to implement (the CoP) measures at a systemic level” (pp. 7-8).