Follow-up: Fundamental Rights Concerns Regarding the Digital Services Act Emerge
1 June 2020
Authors: Liisa Vaaraniemi, Anton Pirinen and Jesper Nevalainen
In March, we provided our initial insights on the European Commission’s initiative to update the rules for digital services in the EU, i.e. the Digital Services Act (the “DSA”). Interestingly, on 27 April 2020, the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs (the “LIBE Committee”) published a Draft Report on the Digital Services Act and Fundamental Rights Issues Posed (the “Draft Report”). The Draft Report was prepared on the Parliament’s own initiative, meaning that it is not part of the formal EU decision-making procedure. Based on the Draft Report, the Parliament appears to support the Commission in its endeavour to update and harmonise the EU legal framework governing digital services. At the same time, the Draft Report highlights the elements that should be taken into account in the DSA from a fundamental rights perspective.
A Balanced Duty of Care
A commonly raised concern regarding the responsibility of online platforms is that content monitoring obligations place an unreasonable cost and resource burden on the platforms. In this regard, the Draft Report proposes a balanced duty-of-care approach.
According to the Draft Report, online platforms that actively host or moderate content should bear more responsibility for the safety of the online environment. While it is emphasised that more responsibility should not amount to a general monitoring obligation, a certain level of duty of care for platforms regarding illegal content on their services is nevertheless proposed. This approach is referred to as a “balanced duty of care”, which appears to include rules on notice-and-action mechanisms and requirements to take proactive measures, but only to the extent that such measures are proportionate to the platforms’ scale of reach and operational capacities. It is further recommended that the platforms’ current limited liability for content under the E-Commerce Directive 2000/31/EC be preserved in order to avoid over-regulation and the over-compliance that could follow from it.
Guidelines on Harmful Content
Another concern that is often encountered in connection with platform liability is the lack of a clear distinction between legal, illegal, and harmful user-generated content. Deciding whether certain content should be removed is often challenging and requires legal expertise, yet online platforms may be held liable for making the wrong decision. In this regard, the Draft Report proposes guidelines.
The Draft Report seems to acknowledge that a key issue with making online platforms liable content moderators is how that liability affects the removal of online content and, thereby, the freedom of expression. The permissibility of content often depends on its context, which must be taken into account in all removal decisions. Moreover, if the contemplated responsibility for harmful content takes effect, contextual assessment will become even more important. In this regard, the Draft Report notes that some automated tools are not sophisticated enough to take context into account. It is also underlined that the interpretation of the law should not be delegated to private companies. Guidelines are therefore proposed to ensure that content is removed or blocked only to the extent necessary.
Availability of Judicial Redress
It has also been considered problematic that restrictions on the freedom of expression resulting from the removal of user-generated content take place in an environment that lacks effective judicial remedies. The Draft Report stresses the need for accessible judicial redress to ensure that individuals whose speech is restricted are not left without recourse.
The Draft Report appears to acknowledge that illegal online content should not only be removed by online platforms; its removal should also be followed up by law enforcement and the judiciary. The Draft Report stresses the importance of cooperation between online platforms and the authorities. Further, the need for appropriate safeguards and due process obligations, including human oversight and verification as well as counter-notice procedures, is recognised. The aim is to ensure that content is removed or blocked accurately, on the right grounds, and with respect for fundamental rights.
A New, Independent EU Body
In order to ensure oversight of and compliance with the new rules, the creation of an independent EU body is suggested. This body would have the power to impose fines or other corrective actions on online platforms.
The Draft Report supports the creation of an independent EU body to monitor compliance with the new rules. The body would enforce procedural safeguards and transparency requirements and provide guidance on harmful content. Although concerns have been expressed that the threat of such sanctions may indirectly incentivise online platforms to remove content rather than keep it available, the Draft Report proposes that the EU body have the power to impose fines and other corrective actions on platforms that fail to provide sufficient information on their procedures or algorithms in a timely manner.
Final Remarks
The fact that there appear to be some inconsistencies in the Draft Report gives a good idea of how challenging the initiative to update the EU’s digital services rules really is. On the one hand, the safety of the online environment is a top priority; on the other hand, it is important to ensure that safety is not achieved at the cost of fundamental rights. Thus, as expected, and as highlighted by the Parliament’s involvement in the matter outside the formal decision-making procedures, updating and harmonising the EU legal framework governing digital services through the DSA appears to be a challenging balancing exercise, and one that may not be resolved quickly. We will keep you updated.