
The rise of populist governments, Covid-19 sceptics, climate change deniers, Russia-Ukraine war propagandists, and malevolent actors, coupled with a changing media market, has increased the risk of disinformation gaining traction online.

An EU study found that more than half of Europeans believe they have been exposed to disinformation online, while 83% think it poses a risk to democracy.

In 2018, a self-regulatory code of practice on disinformation was developed to address the problem. Many online intermediaries have since signed up to the voluntary code, though there has not been a legal imperative to do so – until now.

The risk of disinformation has grown, and the code of practice was strengthened this year to account for this. Though signing up to the code’s commitments is still voluntary, the Digital Services Act – now approved by EU lawmakers – provides fresh incentives for online platforms to do so.

Disinformation and the Digital Services Act

The Digital Services Act sets out a range of obligations for online intermediaries in respect of their role of connecting consumers with goods, services, and content. The most stringent requirements are reserved for ‘very large online platforms’. Among other things, they are obliged to identify, analyse, assess and mitigate ‘systemic risks’ arising from the design, functioning and use made of their services in the EU.

Those obligations cover risks that arise not just from the dissemination of illegal content, but from other content that is deemed harmful too. In this regard, a recital to the Act specifically encourages very large online platforms to “pay particular attention on how their services are used to disseminate or amplify misleading or deceptive content, including disinformation”.

The DSA cites scope for codes of conduct to support “proper application” of the Act. The strengthened code of practice on disinformation is an example of such a code. Though signing up to an affiliated code is voluntary, policymakers have made clear the potential implications for platforms that fail to sign up to the codes or comply with the code commitments after doing so.

In June 2022, after the strengthened code of practice on disinformation was announced, Thierry Breton, EU commissioner for the internal market, said that “very large platforms that repeatedly break the code and do not carry out risk mitigation measures properly risk fines of up to 6% of their global turnover”.

Breton’s comments are not an empty threat. The DSA gives the ‘Digital Services Coordinator’ in each member state broad investigation and enforcement powers, and enables those coordinators, and the European Commission, to intervene and require action where there has been a “systematic failure to comply” with the codes of conduct. In the case of very large online platforms, the Commission will have the power to impose fines of up to 6% of their annual global turnover for the most serious breaches of the Act.

The good news for platforms is that the DSA also makes clear that signing up to the code commitments should be considered a possible risk mitigation measure, of the kind referred to by Breton, and this is also mentioned in the code itself.

The code of practice on disinformation

The 2022 code of practice on disinformation contains 44 commitments and 127 specific measures that signatories can adopt.

One of the core themes that runs through the commitments is the demonetisation of disinformation: cutting the financial incentives to spread disinformation by ensuring that purveyors of disinformation across the advertising supply chain do not benefit from advertising revenues.

The development of a consistent understanding of and approach to political advertising is also envisaged under the code, with measures promoting better labelling of and verification schemes for political advertising. Other measures promote media literacy and critical thinking among users.

Platforms are also encouraged to adopt “safe design practices”, such as the use of recommender systems and pre-testing, to minimise the risk that disinformation will spread on their service. The code further seeks to ensure users are empowered to identify and report disinformation online.

The code also envisages platforms agreeing on “a cross-service understanding of manipulative behaviours, actors and practices not permitted on their services” to limit such behaviours. It lists the creation and use of fake accounts, bot-driven amplification, impersonation, the use of malicious deep fakes, and non-transparent paid promotion by influencers as among the behaviours that would qualify as manipulative.

Other commitments and measures are aimed at boosting cooperation between platforms and other stakeholders with an interest in curbing disinformation. This includes measures promoting data sharing with researchers, the establishment of collaboration agreements with fact-checking bodies, and information sharing with the code’s permanent task-force – a body chaired by the European Commission and made up of a group of signatories, the European Regulators Group for Audiovisual Media Services, the European Digital Media Observatory and the European External Action Service.

The code provides for regular reporting to the Commission of the measures signatories implement to meet the commitments they have signed up to under the code. Very large online platforms are expected to report every six months on the implementation of their commitments and will be audited. Other signatories should report on a yearly basis.

The code also promotes transparency in relation to policies and the implementation of measures to combat disinformation, including in relation to the appeal mechanism open to users affected by decisions made regarding their content.

The code also contains commitments and measures designed to support the monitoring of the code’s impact. It envisages signatories working together to develop structural indicators for this purpose.

Current signatories to the code include major technology companies Google, Meta, Microsoft and Twitter, as well as other social media platforms such as TikTok. Smaller platforms, fact-checkers, civil society groups and advertising industry bodies are also among the signatories. The European Commission issued a call for others to sign up to the code in July.

Actions for platforms

Now the DSA has been finalised, it is likely to become EU law within weeks. Out-Law anticipates that the DSA will enter into force sometime in mid-November. It will begin to apply directly and in full across EU member states 15 months after it enters into force.

However, some provisions impacting platforms will have effect from the point the DSA enters into force. These include a series of disclosure obligations that are designed to help the European Commission determine whether platforms should be designated as ‘very large online platforms’ (VLOPs) for the purposes of the DSA.

It seems likely that the Commission will designate platforms as VLOPs during 2023. The DSA makes provision for a platform to become subject to the VLOP rules four months after it has been designated as a VLOP by the Commission. This means it is possible that some VLOPs could be subject to the stiff requirements of the DSA, including the provisions concerning disinformation, before the 15-month transition period for the DSA expires and all its provisions kick in.

Whether a platform is a VLOP or not, it is clear that tackling disinformation is high on the European Commission’s agenda. There was evidence of this at a recent conference hosted by the Central European Digital Media Observatory, where Krisztina Stump of the Commission’s directorate-general for communications networks, content and technology set out the Commission’s view of the code of practice on disinformation. The prospect of new legislation underpinning action on tackling disinformation, and the stiff sanctions regime envisaged, should incentivise platforms to review the strengthened code and take steps to ensure they comply with the commitments in it.

Written by Aurélie Caillard of Pinsent Masons in Luxembourg.
