In its simplest form, the rule of law can be defined as a way of restricting arbitrary power by subjecting everyone to democratically passed, fairly applied and fairly enforced laws. Such laws are subject to proper procedure and judicial oversight. Rule by decree, by contrast, is the quick and unaccountable creation of laws, a method favoured by despots.
In March 2017, I commented on this blog on the Commission’s attempt to regulate large platforms such as Facebook, Google and others by politely asking them to take action against unfair contract terms and to remove fraud and scams. I argued then that, rather than giving the Commission ex ante enforcement powers, the Treaty only gives it the right to propose laws and to occasionally partake in oversight of the ones that already exist. Where such laws do not exist, because the requisite agreement is lacking, they cannot and should not be replaced by letters demanding action from individual corporations.
In September 2017, the Commission extended its ambition to regulate platforms by proposing to monitor illegal content on online platforms. In that dubious document, the Commission asserted that “what is illegal offline is also illegal online” and demanded that “platforms” step up the fight against illegal content online. In summary, the document demanded that platforms do more to tackle illegal content and take “proactive measures” to detect and remove it. The Commission kept repeating that the proposed demands do not conflict with the limitations on ISP liability in Articles 12-15 of the E-Commerce Directive, but did little to explain how this tension could be resolved in practice.1 Furthermore, different kinds of “illegal” content were all bundled under one title, putting such disparate issues as child pornography, hate speech, terrorism, commercial fraud and copyright infringement under one hat. Already at that point, the Commission had been warned that its position was contradictory and dangerous for free speech.
Good laws cannot be replaced by decrees.
Having learned little from this criticism, the Commission marched on and issued a set of operational measures (in the form of a Recommendation) building on the September 2017 Communication, targeting “companies and Member States” and threatening legislation if these measures fail to be effective. In it, the Commission calls for:
- Clearer ‘notice and action’ procedures
- More efficient tools and proactive technologies
- Stronger safeguards to ensure fundamental rights, in particular when automated tools are used for removal
- Special attention to small companies
- Closer cooperation with authorities
Special rules are introduced for “terrorist” content online, and these include:
- the obligation to remove such content within an hour of referral
- faster detection and removal
- better referral
- regular reporting to the Commission, in particular about referrals
While the criticism concerning the September 2017 Communication still stands, new concerns have to be added. They can be summarised as follows:
- The Guidelines lack effective safeguards against abuse. Points 1.11 and 1.12 require only that content providers be given the right to contest a decision “within a reasonable time period” and through a “user-friendly” mechanism. Point 18 encourages “proportionate and specific proactive measures” in tackling illegal content, in particular by automated means. Points 19 and 20 call for “effective and appropriate safeguards” but essentially leave this issue to Member States, further increasing disparities between them and fostering confusion. Point 21 unhelpfully and vaguely calls for protection against abuse without specifying what this protection might consist of.
- The Guidelines are in direct conflict with Article 15 of the E-Commerce Directive. That article explicitly states that Member States shall not impose on providers a general obligation to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity. While it is perfectly possible and allowed, in a civil and democratic society, to repeal this article, this has not happened yet.
- The Guidelines bundle different kinds of illegal content. Illegal content is defined in point 1.4.b as “any information which is not in compliance with Union law or the law of a Member State concerned”. This definition is unacceptably wide. Furthermore, it conflates content which is illegal at EU level (because of harmonisation efforts) with material that is illegal only in some states, which in turn brings subsidiarity into play (for why would Member States transfer issues concerning locally illegal content to the EU?). Moreover, copyright, hate speech, child pornography and terrorist content require vastly different monitoring and enforcement tools. It makes no sense to treat them as one.
- Finally, and most importantly, good laws cannot be replaced by decrees. The Commission understands all too well that an effort to pass legislation on hate speech alone (never mind anything else covered by the Recommendation) would require lengthy public consultations, acrimonious disputes between stakeholders and endless negotiations as texts progress through various committees in the effort to find a compromise between the Parliament and the Council. Whereas soft law is a legitimate and useful governance tool, it is defined by its non-binding character.2 A recommendation that threatens legislation unless it achieves its desired effect is not soft law; it is a decree. We should remind the Commission that the reason why EU Internet laws have been a success3 is precisely that democratic procedures have been respected. It may be tempting to take shortcuts, but this undermines the legitimacy of EU institutions and will ultimately do little to combat illegal content.
- In fact, the Commission got into trouble for exactly the same reasons with another badly formulated proposal, Copyright in the Digital Single Market. ↩
- As evidenced, among others, in Article 288 TFEU. ↩
- The E-Commerce and InfoSoc directives have lasted over 17 years while the Data Protection Directive is over 20 years old. ↩