The European Electronic Communications Code – Where Are We Now, Where Are We Going?

After a lengthy process lasting over two years, the new European Electronic Communications Code (EECC) was adopted on December 18, 2018. The Directive proposing the EECC was first published in 2016 as part of wider attempts to reform EU digital laws on the content and carrier layers, promised in the 2015 Digital Single Market Strategy. The EECC is now in force and must be implemented by the end of 2020. This short post attempts to summarise the biggest changes that the new regulatory framework brings and to highlight some of its weaknesses.

The 2015 DSM Strategy, analysing the need for improvements in telecoms, points out that the sector suffers from “isolated national markets, a lack of regulatory consistency and predictability across the EU, particularly for radio spectrum, and lack of sufficient investment notably in rural areas”. In order to remedy the situation, and in particular “deliver access to high-performance fixed and wireless broadband infrastructure”, reform was needed. A casual reader of the less-than-two-page part of the Strategy dedicated to telecoms would be left confused. Other than a call for a better spectrum policy and more investment in high-speed networks (both of which had been repeatedly called for in earlier papers and are, therefore, not particularly new), such a reader would not be able to tell whether the present EU regulatory framework is functional or how the EU compares to other developed economies. To that reader, the fact that the EU lags behind its rivals in high-speed broadband deployment and take-up, and is also behind on 5G development, would not be apparent.

In order to understand the importance of the EECC and see whether it can answer these challenges, it is necessary to give an overview of the most important features of the (still applicable) 2009 regulatory framework. Three elements, in particular, are worth noting:

  • First, the EU regulatory framework is based on competition law principles (significant market power, potential abuse, remedies, etc.) but is, in reality, a separate (sector-specific) system of rules which applies in parallel with ordinary competition rules. While the ultimate (declared) aim is for competition law alone to apply, the EU is not at that stage yet. The main reason why sector-specific rules are needed is that competition law remedies problems ex post, after they arise, while intervention is needed ex ante, before distortions appear.
  • Second, the main regulatory method is the ex ante application of remedies to market actors with significant market power (SMP). This type of regulation is asymmetric by definition, as it applies only to SMP undertakings. In practice, the regulatory effort has concentrated on the incumbents – former state-owned telecoms companies which were opened up to competition in the 1980s.
  • Third, the EU approach is primarily a service-based competition model, in which the regulator encourages entrants to offer competitive services through access imposed on the incumbents by ex ante regulation. By contrast, in an infrastructure-based competition model, competitors are incentivised to build their own infrastructure.

The 2009 regulatory framework is largely based on the 2002 one, with significant elements dating from even earlier laws. Since it is important to understand the extent to which the EECC brings novelties, the following can be stated:

  • The EECC is a codification measure which brings the four main directives making up the framework (the Framework, Authorisation, Access and Universal Service directives)1 from the 2002 and 2009 packages into a single text.2 While this potentially removes the confusion arising from the many amendments which have accumulated over the years, it does not bring an entirely new text.
  • The EECC rests on the same ideas and principles as the 2002 and 2009 frameworks. The changes are frequently minor and often only cosmetic, and the articles have largely been replicated. More importantly, the fundamental regulatory ideas remain those of the previous frameworks: the EECC is still a sector-specific, ex ante and asymmetric system which targets undertakings with significant market power. Although the number of regulated markets has gradually been reduced over the years, to the effect that only the wholesale side is properly regulated, full competition does not exist.

Although one may wish to debate the prudence of extending the present regulatory model to future telecoms, it is worth stating that not all changes are purely cosmetic. Beyond the codification and simplification, the EECC introduces some changes geared towards improving investment.

  • In particular, attempts have been made to encourage investment in next-generation networks. The measures essentially exempt undertakings from the SMP regulatory regime where they commit to building new networks. The co-investment provision of Article 76 allows NRAs to accept commitments from undertakings wishing to allow co-investment (through co-ownership, risk sharing or purchase agreements).
  • Significant attempts have been made to harmonise and manage radio spectrum (Articles 28, 35-37, 45-55). Market entry for new players and shared use of the radio spectrum, in particular, should be easier.
  • A modest attempt to regulate OTT services has been made. Rather than subsuming all such services under the full scope of telecoms rules, only some of them are included, and only in a limited number of cases. This is, in principle, a good and measured approach.3

We find that the modestly-framed EECC fails to address the primary challenges that EU telecoms are facing today:

  • The EU’s telecommunications capital investments are relatively low and falling further. Regulatory burden is not the only factor to consider, but it certainly seems to play a role. Telecoms expenditure is lower in the EU27 than in either the USA or some Asian countries. The modest changes that the EECC brings are not structural and not overtly pro-investment.
  • There are reasons to believe that the model based on access and price controls which the EU has chosen may deliver better broadband products in the short to medium term but does not deliver next-generation networks (there is good empirical evidence for this). The problem of how to reach acceptable NGA deployment and take-up rates remains unaddressed.
  • Overall, the industry still seems over-regulated.
  • The value of EU telecoms companies halved between 2012 and 2018 while that of US and Asian companies increased. The cost of rolling out full fibre and 5G is estimated at €500 billion and is likely to be significantly higher. The EECC gives EU telecoms companies few incentives to undertake this. The EU is investing less in telecoms than its main competitors and, unless more fundamental changes are made, EU companies will move further into retail.
  1. The ePrivacy Directive is subject to a separate proposal, see here.
  2. For the codified version of the 2009 framework see here.
  3. It should be noted, however, that the proposed ePrivacy Regulation calls for much more comprehensive coverage of OTTs.

EU Proposal for a Regulation Preventing the Dissemination of Terrorist Content Online: An Overview

The Commission announced yesterday a proposal for a Regulation Preventing the Dissemination of Terrorist Content Online. Somebody not following developments closely might get the impression that this is a stand-alone initiative. In reality, the Proposal follows a broader drive to regulate platforms, announced in the 2015 DSM Strategy, followed up in the 2016 Communication on platforms, and further elaborated in the ‘soft law’ 2017 Communication and 2018 Recommendation on illegal content online.

The desire to regulate “platforms” is, in itself, problematic. Platforms are neither natural subjects for IT regulators (unlike telecoms networks and services or information society services), nor sufficiently clearly defined to lend themselves to straightforward regulation. They differ in size, scope, type and impact and use a wide variety of business models. The EU’s continuous drive to regulate them without exploring the deeper implications of such an approach is worrying.

The present Proposal aims to “prevent the misuse of hosting services for the dissemination of terrorist content online”. It imposes duties of care on hosting services to prevent the dissemination of terrorist content and obliges Member States to put in place measures to identify such content and have it removed. The Proposal has the same wide scope as the GDPR, applying to all hosting providers offering services in the Union, irrespective of their place of establishment.

The definition of terrorist offences is taken from the 2017 Directive on terrorism. Terrorist content, which is the Proposal’s main target, is defined as inciting, advocating, encouraging, promoting or instructing on terrorist offences as defined in that Directive. Hosting service providers are obliged to take action against its dissemination and to include provisions to that effect in their terms and conditions.

The interesting (and controversial) part of the Proposal appears in the form of the “removal orders” of Article 4. These are binding demands issued by competent authorities and directed at hosting service providers. National authorities may require providers to remove content or disable access to it, and providers would be required to do so “within one hour from receipt of the removal order”. This obligation seems to apply irrespective of the size of the site or the hours during which it is staffed. A statement of reasons is available, but only upon request from the provider, and cannot, in any case, delay the removal order. If the hosting provider disagrees with the order due to “manifest errors” or because it needs “clarification”, the removal is postponed until such clarification is provided. In addition to removal orders, the authorities may send voluntary requests called “referrals” (Article 5), which providers are free to assess against their own terms and conditions and need not act upon.

Another controversial feature is the “proactive measures” that Article 6 demands of hosting providers. Providers need to take “effective and proportionate” measures, “where appropriate”, while assessing risks and taking fundamental rights into consideration. Once a removal order under Article 4 has been issued, however, special proactive measures kick in for the hosting provider which was the subject of that order, requiring it to submit annual reports on the prevention of re-upload and on the detection, removal and disabling of content. Where these are deemed insufficient, further proactive measures may be required and imposed compulsorily.

Article 8 demands transparency from hosting providers, including in the use of terms and conditions, while Article 9 requires human oversight of automated removal measures. The rest of the Proposal contains detailed measures on complaints, cooperation, implementation and enforcement.

Particularly interesting is the status of hosting providers located outside the EU. They are, under Article 16, required to designate a legal representative for the purposes of compliance. Under Article 15, where a provider does not have a place of establishment in the EU, it is the place of residence or establishment of the legal representative that is relevant for enforcement purposes.

The penalties set out in Article 18 are for Member States to determine, but Member States are obliged to punish systematic failures with financial penalties of up to 4% of the hosting provider’s global turnover.
There are, in my view, three main problems with the proposal:

  • first, the Proposal derogates specifically (Recital 19) from the prohibition on general monitoring obligations set out in Article 15 of the E-Commerce Directive. This is a dangerous and poorly justified precedent. Although the Recital does speak of “balancing” in such cases, it seems that the mere fact that the label “terrorist” has been attached to particular content, at a particular provider, would trigger a derogation whose effect and duration are unknown and which Article 15 neither justifies nor contemplates.
  • second, complying with the obligations in the Proposal, and in particular Article 6, will require the use of filters, since relying on human moderators would be disproportionately expensive. Filtering technology works unreliably and tends to ‘catch’ legitimate content, often creating more problems than it solves.
  • third, the cost-benefit side of the question is obscure in the Proposal. Terrorist content tends to appear and disappear quickly and is rarely simply left sitting on platforms for them to conveniently remove; it tends to shift from one account to another and from one platform to another. Burdening hosts with 24/7 removal and monitoring obligations may look justified but is unlikely to give the desired results and may, in the worst cases, lead to governmental abuse. It is further unclear whether non-EU providers would simply withdraw from the EU (as some did as a result of the GDPR’s extraterritorial reach).

In summary, the problem is more complex than the regulator would like to believe and may require creative thinking, the use of co-regulation and technical solutions of the next generation rather than heavy-handed removal and penalty system.

A comment on the JURI Committee Report on Article 13 of the Copyright Proposal and everything that is wrong with it

Silke von Lewinski published today a comment on the JURI Committee Report on the Copyright Proposal. I am reposting here what I posted as a comment on her original post, and it should be read as such. I believe that it is important to keep the momentum going as we approach further debate in the Parliament.

First, that platforms are legitimate targets of regulation is by no means a foregone conclusion. That the proposal targets “platforms” rather than “information society services”, and makes a number of assumptions along the way, is a problem of enormous proportions, not just in terms of how well the proposal communicates with the E-Commerce Directive, but also in terms of its relation to other EU IT laws. In short, one cannot just wave a magic wand, assume that all UGC sites are “platforms”, and then happily continue discussing whether such platforms engage in communication to the public, as if this move had no consequences, all of this in the almost complete absence of an official EU policy on platforms. This is an issue I discuss in a forthcoming article, (2018) 6 CLSR.

Second, the Proposal simply states that “active” online platforms perform a communication to the public. What an “active” site is has been debated ad nauseam for over two decades now, but it is still not a point which has been adequately resolved in CJEU case law, which is obscure and can often only be read in its particular context. One cannot equate TV sets in hotel rooms with streaming services or cloud depositories. Neither can one use Pirate Bay – an openly and proudly illegal distributor – as a comparison point for a streaming/distribution platform, no matter how active the latter might be. The problem here is the communication to the public made by a bona fide platform, and the Pirate Bay case is of limited to no use.

Third, the main problem is not whether a “platform” (for what this term is worth) can technically be deemed to fit within the various bits and pieces of law on communication to the public, but whether it makes sense to dump all online distributors into the same category and burden them with regulation and filtering obligations in almost complete disregard of the spirit and letter of ECD Articles 12-15. The ISP liability regime was designed with the very specific aim of avoiding indiscriminate filtering and cannot be changed by simply labelling all online distributors as active – a presumption of huge consequence.

Fourth, while there is no doubt that Article 14 was never meant to protect knowledgeable providers or primary infringers, the Voss proposal sharpens the already muddled Recital 38 by creating what is essentially an irrebuttable presumption of knowledge for all UGC ISPs. Whereas the original Proposal speaks of communication only where there are activities “beyond the mere provision of physical facilities”, the Voss text simply states that all online content sharing providers are communicators. Why would they be, and, even where they are, why would they exercise prior rather than subsequent restraint? I will not even begin on the subject of how broad the definition in 37a really is. This subjects a vast and ill-defined category to a poorly designed and heavily-lobbied-for set of expensive filtering obligations. Surely, this cannot have been the intention of the original drafters of the InfoSoc Directive?

Fifth, the main problem of both the original Proposal and the Voss text is that they demand indiscriminate and expensive filtering of the kind only Google can pay for. All this in complete contradiction to the SABAM case, which is very clear on this point (but is not discussed in the article above).

Finally, the Proposal is extremely poorly harmonised with the spirit and letter of the E-Commerce Directive. I am afraid that no amount of careful interpretation of the communication to the public will resolve this. The Commission has been told, many times and by very many people, that this is a problem of gigantic proportions. It has so far chosen to ignore it, which fortunately led to its defeat in the early summer. Let us hope that the Proposal will be defeated once and for all in September so that we can begin a serious and much-needed discussion on the reform of EU copyright law.

The Commission’s Proposal on Fairness and Transparency for Business Users of Platforms – Online Platforms to Be More Transparent, Disclose Ranking Criteria

On April 26, the Commission published a proposal for a Regulation on promoting fairness and transparency on platforms. This proposal follows the 2015 EU Digital Single Market Strategy’s promise to look into platforms and the Communication on Online Platforms the Commission published in 2016. While these documents did not promise an overarching law on platforms (covering both B2C and B2B situations), they signalled the Commission’s willingness to look at a set of targeted issues and introduce legislation if and where necessary. The present proposal aims to improve transparency in B2B relations involving, on one side, platforms and, on the other, businesses which provide services to other businesses and consumers through such platforms. The main idea is that platforms act as “gatekeepers” of the online world, effectively creating dependency for many businesses. The proposal brings extra requirements for clarity and transparency in situations where platforms could abuse their dominance. It seems that the choice of transparency (rather than the blacklisting of certain practices) as the Commission’s main tool is meant to satisfy the proportionality and subsidiarity criteria, leaving further action to Member States and/or competition and marketing laws. The Proposal is not meant to be a full harmonisation measure.

The Regulation would apply to online intermediaries and search engines (hereafter “platforms”) providing services to business users which have a place of establishment in the EU and which offer goods and services to EU consumers. It is irrelevant for the purposes of the Proposal whether the platforms themselves have a place of business or residence in the EU, as long as the business users to whom they offer services have such a place and target EU consumers.

An intermediation service, which the Proposal places under transparency and fairness obligations, is defined as a) an information society service (ISS) within the meaning of the E-Commerce Directive, b) which facilitates direct transactions between business users and consumers and c) which is provided to business users on the basis of an underlying relationship between online platforms and others (businesses and consumers). A provider can be either a natural or a legal person. Online search engines are defined simply as digital services allowing users to perform searches. The definition of ISSs seems very broad, as these are, in reality, all electronic services provided at a distance and for remuneration (the latter not necessarily in the form of a payment). Additionally, the majority of ISSs do facilitate B2C transactions.

The obligations to be introduced can be summarised as follows:

  • Article 3 requires that platforms increase transparency of their unilaterally drafted terms and conditions, and in particular any modifications made to them.
  • Article 4 requires that any suspension and termination of platform services be accompanied by a statement of reasons.
  • Article 5 brings potentially the most significant changes. Online platforms would be required to set out in their terms and conditions the main parameters determining ranking and the reasons for their relative importance. Additionally, where businesses have an opportunity to pay to influence ranking, such possibility would also have to be disclosed. Online platforms are not required to disclose any trade secrets as defined in EU Trade Secrets Directive.
  • Article 6 obliges intermediaries to disclose in their terms and conditions any preference given to goods and services they themselves (or businesses they control) offer. This covers situations where platforms offer and promote their own products, as in the Google comparison shopping case.
  • Article 7 requires the disclosure (description) in terms and conditions of any use of data that business users and consumers disclose to platforms.
  • Article 8 requires platforms that prohibit the provision of the same goods and services through channels other than the platform to disclose such restrictions. This covers situations where platforms demand exclusivity from their business customers.
  • Article 9 introduces an internal system for complaint-handling.
  • Articles 10-11 introduce the possibility of using mediation.
  • Article 12 allows organisations with a legitimate interest in representing business users, as well as public bodies set up in Member States, to take action to stop or prohibit non-compliance. The organisations in question need to be non-profit.

While transparency requirements in some of the articles seem superficially reasonable, I see two major problems with the proposal.

The first is its extent. While there is some agreement as to what constitutes a search engine,1 the same cannot be said of intermediaries. Even if one assumes that most intermediaries that fall under the E-Commerce Directive would also fall under the Proposal, there is still scope for questioning the reasonableness of such a proposal.

The second problem is the potentially very wide scope of Article 5. Ranking criteria are not determined on a whim but are the result of algorithms which are a business asset and a trade secret. While the proposed Article does not require the disclosure of anything that the Trade Secrets Directive itself does not consider a trade secret,2 it seems impossible to release meaningful information on search ranking criteria without also releasing trade secrets.

It remains to be seen whether the proposal will move smoothly through the legislative procedure. This seems very unlikely, as heavy lobbying is expected from a variety of platforms. This brings into question the meaningfulness of the Commission’s exercise. Under the circumstances, a more vigorous application of competition law would have probably achieved the same result.

  1. Although no full agreement. Is Facebook (also) a search engine? Instagram?
  2. Secret and not generally known, with a commercial value and subject to the controller’s reasonable steps to keep it secret.

Bad Reporting of EU Tech Policy and Regulation in the Media: How to Recognise it and How to Get the Real Picture

While there can be no doubt that EU tech policy is not the sexiest topic on the planet, it is nevertheless covered very frequently in the media. In the era of privacy leaks, government surveillance, AI and streaming, to name but a few, it is also very relevant. It is not surprising, therefore, that reputable newspapers often cover real and imagined EU activities with vigour. A careful reader would, however, find that articles written by experienced journalists often carry sensationalist titles. Even more surprisingly, such a reader would have difficulty confirming the articles’ claims on the EU’s own portals. The reporting is often incoherent, confuses different policy areas, rarely quotes original sources and frequently points the reader in the wrong direction. This was particularly obvious in the recent wave of coverage of global platforms such as Google and Facebook. Here are some examples:

On March 14, 2018, Reuters carried a story with a bombastic title: “Google, Apple face EU law on business practices”. The article claims that the EU is “drafting a new regulation”, that this regulation will be “specifically targeting online platforms” and that it will ask companies to be more transparent about how they rank search results. The same story was subsequently carried, almost verbatim, by a number of newspapers and online portals, including the Financial Times, the EU Bulletin and Business Review. Anybody even vaguely familiar with the EU’s Digital Single Market project would be surprised at such statements, all the more so since the Commission’s own 2016 Communication on Platforms states that no new directive or regulation on platforms is forthcoming, nor does it list search engine ranking algorithms as being under scrutiny. While the Commission is not always systematic in the way it presents its activities, it rarely acts without a proper announcement.

What is the origin of the story, then? The mystery is not very deep. A careful reader would notice that a new EU package on consumer law appeared on April 11. Part of that package is a proposed Directive updating a number of EU consumer laws (informed, among other things, by a behavioural study on transparency in online platforms), including the 2005 Unfair Commercial Practices Directive. Suggested as one of the changes is a new Article 6a, applying to online marketplaces. It says that “the main parameters determining ranking of offers presented to the consumer as result of his search query on the online marketplace” must be disclosed to consumers, or the practice would be considered unfair. More interesting, however, is the Commission’s work on fairness in platform-to-business relations. Following the 2015 Strategy’s promise to look at platforms, the Commission committed to examining the transparency of online platforms’ trading practices. The inception impact assessment, published in late 2017, looks at three legislative options: soft law, a targeted EU instrument coupled with industry-led action, or detailed EU principles. Whoever wrote the original Reuters article would have seen the work on the consumer package and/or the transparency document on fairness in platform-to-business relations, and may also have seen a draft document resulting from one of the three options. They then constructed a story which, while containing elements of truth, is still considerably off the mark. It is also worth noting that even modestly controversial EU proposals get significantly modified and are occasionally withdrawn where the Council and the Parliament cannot reach an agreement. What will happen, at best, is that a proposal will be tabled and discussed at length.

Another article appeared in the Financial Times on April 17. It claimed that the EU is “to give judges power to seize terror suspect emails and texts”. It begins with the familiar “Brussels is planning” words and goes on to say that judges will get “the power to seize emails and text messages of terror suspects” and that the Commission “will propose giving national judges the extraterritorial power to order companies to hand over ‘e-evidence’ held in servers in another EU country or outside of the bloc.” While this sounds both dramatic and controversial, readers would be hard pressed to understand which regulatory framework this proposal falls under, which DGs might be involved, where to find the policy document, how to follow the developments or how the proposal fits into the broader Digital Single Market Strategy. A much more coherent (albeit shorter) account of the affair is to be found on Politico’s website. There, it is clarified that the proposal is part of the Commission’s drive to improve the taking of e-evidence, and its main features as well as its differences from the current regime are outlined. Crucially, the article identifies that this is part of the effort to achieve a “common judicial area” and that it only applies to serious crime. A casual look at the Commission’s website on taking cross-border evidence would reveal that this is indeed part of an ongoing effort in cross-border judicial cooperation, that the “European Investigation Order” is already in force and that work continues on the “European Production Order”. While the FT article is not entirely wrong in its reporting, the value it adds to the debate is minimal.

Similar stories are to be found all over the Internet. It is usually easy to recognise them as they are rarely specific and are usually framed in the clichéd “Brussels plans” or “EU threatens” style. How is the reader to find out what is really behind the story?

The first step is to identify the area the tech policy belongs to. Very roughly, three regulatory circles exist in the EU’s Digital Single Market effort. Information society services are regulated under the e-commerce framework (here also including copyright, privacy, speech regulation, etc.). The telecoms regulatory framework covers telecoms networks and services. Finally, media under editorial control are subject to the Audiovisual Media Services Directive (AVMSD). The circles, although distinct, overlap. Privacy, for example, is subject both to e-commerce rules (the GDPR) and telecoms rules (the ePrivacy regulation), which means that a full picture can only be obtained by looking at both.

The second step is to identify the general EU policy in the area in question. The 2015 Digital Single Market Strategy should always be consulted, as should any other statements that further clarify the EU’s political position on a particular issue. For platforms, for example, there exists the 2016 Communication (mentioned above), but also the 2017 Communication and the 2018 Recommendation on illegal content online. The formation of a potential proposal is always accompanied by various studies the Commission publishes (usually outsourced to think tanks) as well as stakeholder consultations, speeches and announcements on websites. The platform law above, for instance, was preceded by the already-mentioned inception impact assessment and the behavioural study on transparency.

The third step is to check whether any policy ideas identified in steps 1 and 2 have been transformed into legislative proposals. Privacy, consumer protection, cybersecurity, copyright, telecoms, AVMS services and e-commerce all have their own websites (sometimes multiple ones at different institutions). These not only summarise present laws but also explain policies and plans for future regulation. Further to that, stakeholders are often engaged in commenting on relevant proposals, both officially (as part of the Commission’s fact-finding exercises) and in their own reports, blogs, etc. Such documents are invaluable for gauging public and professional opinion. The final step is to check whether proposals are about to become law. Existing proposals can be traced through the Legislative Observatory.

While the above may seem banal, it is rarely followed, which results in confusing and messy reporting. Although it may sometimes seem that proposals come “out of the blue”, this is in reality almost never the case, as significant preparatory work is needed and the Commission launches initiatives based on plans announced in strategy documents. In other words, the Commission maintains a certain degree of transparency for political and budgetary reasons and has no political mandate to randomly tackle issues not otherwise announced as part of a wider political agenda. In such circumstances, it becomes very important to place the news in its proper context.

The EU Reform of Consumer Law: A Quick Overview of the “New Deal”

After promising a comprehensive revision of consumer law in September 2017, and in light of the fitness check published in May 2017, on 11 April 2018 the Commission published the New Deal for Consumers. In justifying the need for the reform, the Commission states that, while the current substantive rules are “fit for purpose, their effectiveness is hindered by lack of awareness and by insufficient enforcement and consumer redress opportunities”. The new package is mainly designed to stop and deter infringements (more effective prevention) and ensure redress when needed (better enforcement).

The present EU consumer law framework consists of a patchwork of rules dating from different periods. It has largely not been touched since the 2011 Consumer Rights Directive was adopted.1 While this framework functions relatively well,2 it has not been subject to serious updates for eight years. In the meantime, new challenges have arisen, including an unprecedented rise in online sales, the digitisation of society and more aggressive profiling and tracking. The New Deal is meant to bring the framework in line with this new digital reality.

The reform consists of a Communication and two proposals for Directives:

  • Communication on A New Deal for Consumers. This document simply summarises the status quo, states the objectives and explains what each of the two directives does.
  • Proposal for a Directive on representative actions for the protection of the collective interests of consumers. This directive introduces a collective redress right in cases where groups of consumers have suffered harm. Consumer organisations would, under this proposal, be able to demand redress.
  • Proposal for a Directive on better enforcement and modernisation of EU consumer protection rules. This proposal aims to modernise consumer law, in particular its enforcement. It does so by modifying the 1993 Unfair Contract Terms, the 1998 Price Indication, the 2005 Unfair Commercial Practices and the 2011 Consumer Rights directives.

The changes that the proposed directives bring can be summarised as follows:

  1. In terms of the new collective redress: qualified entities will be able to bring actions representing consumers’ collective interests. The proposed Directive is not a full harmonisation measure, leaving Member States the opportunity to introduce other collective remedies. The Directive would apply to infringements of the EU laws listed in the Annex which harm consumers’ collective interests. Collective interests are defined simply as those affecting multiple consumers (e.g. mass recalls, delays, faults, etc.). The Directive does not change Member States’ laws regarding other available actions, nor does it affect private international law issues (other than to make class actions possible). Qualified entities are to be designated by Member States.
  2. 2005 Unfair Commercial Practices Directive: a new right to individual remedies for consumers is introduced in Article 11a, and the rules on penalties have been strengthened. While the 2005 Directive prohibited a number of practices, it did not introduce specific redress, leaving this issue to the Member States. This changes under the new proposal. As for penalties, a list of common, non-exhaustive criteria for assessing the gravity of infringements is introduced in Article 13 of the Directive. Particularly significant, however, is that for ‘widespread infringements’ and ‘widespread infringements with a Union dimension’, Member States will be required to introduce fines with a maximum of at least 4% of the trader’s turnover. In terms of paid placement and paid inclusion, the proposal adds a new unfair commercial practice: providing search results in response to a consumer’s online search query without declaring any paid placement or paid inclusion.
  3. 2011 Consumer Rights Directive: the proposal makes a number of important additions. It brings the Directive in line with ‘digital content’ and ‘digital services’ as defined in the proposed Digital Content Directive. A new Article 6a provides specific additional pre-contractual information requirements for contracts concluded on online marketplaces (including ranking parameters, whether the other party is a trader, the consumer rights arising, etc.), thus increasing transparency for consumers on online marketplaces. The proposal also amends the Directive on a number of other issues between traders and consumers, effectively removing unnecessary burdens from businesses.
  4. 1993 Unfair Contract Terms Directive: a new article on penalties is inserted.
  5. 1998 Price Indication Directive: the article on penalties is amended.

While it is difficult to give a comprehensive assessment of the changes at this early stage, it seems that the new provisions on collective redress and significantly increased fines would have an immediate effect. EU consumer laws are already among the most robust in the world and this reform would both update them and strengthen them.3 On the other hand, it also seems clear that more frequent and increasingly invasive privacy incursions, coupled with ever more subtle modes of tracking consumers and influencing their behaviour, warrant new approaches. It is, therefore, questionable to what extent the EU framework is capable of answering these challenges.

  1. The 1993 Unfair Contract Terms Directive, the 2005 Unfair Commercial Practices Directive and the 2011 Consumer Rights Directive are the key laws.
  2. In that no open calls for its thorough reform have been heard.
  3. This is in addition to new privacy rules, coming into effect in May 2018.

Rule by Decree Part II: How the Commission Undermines the Rule of Law by Attempting to Regulate Online Content by Issuing Edicts

In its most simple form, the rule of law can be defined as a way of restricting arbitrary power by subjecting everyone to democratically passed and fairly applied and enforced laws. Such laws are subject to proper procedure and judicial oversight. Rule by decree, on the contrary, is characterised by the quick and unaccountable creation of laws, a method favoured by despots.

In March 2017, I commented on this blog on the Commission’s attempt to regulate large platforms such as Facebook, Google and others by politely asking them to take action in the field of unfair contract terms and the removal of fraud and scams. I noted at that point that, rather than giving the Commission ex ante enforcement powers, the Treaty only gives it the right to propose laws and to occasionally partake in the oversight of those that already exist. Where such laws do not exist, because the necessary agreement is lacking, they cannot and should not be replaced by letters demanding action from individual corporations.

In September 2017, the Commission extended its ambition to regulate platforms by proposing to monitor illegal content on online platforms. In that dubious document, the Commission asserted that “what is illegal offline is also illegal online” and demanded that “platforms” step up the fight against illegal content online. In summary, the document demanded that platforms do more to tackle illegal content and take “proactive measures” to detect and remove such content. The Commission kept repeating that the proposed demands do not conflict with the limitations on ISP liability in Articles 12-15 of the E-Commerce Directive, but did little to explain how this conflict could be resolved in practice.1 Further to that, different kinds of “illegal” content were all bundled under one title, thus putting such issues as child pornography, hate speech, terrorism, commercial fraud and copyright infringement under the same umbrella. Already at that point, the Commission had been warned that its position was contradictory and dangerous for free speech.

Good laws cannot be replaced by decrees.

Having learned little from the criticism, the Commission marched on and issued a set of operational measures (in the form of a Recommendation) leaning on the September 2017 Communication, targeting “companies and Member States” and threatening legislation if these measures fail to be effective. In it, the Commission calls for

  • Clearer ‘notice and action’ procedures
  • More efficient tools and proactive technologies
  • Stronger safeguards to ensure fundamental rights, in particular when automated tools are used for removal
  • Special attention to small companies
  • Closer cooperation with authorities

Special rules are introduced for “terrorist” content online, and these include

  • the obligation to remove such content within an hour of referral
  • faster detection and removal
  • better referral
  • regular reporting to the Commission, in particular about referrals

While the criticism concerning the September 2017 communication still stands, new concerns have to be added. They can be summarised as follows:

  1. The Guidelines lack effective safeguards against abuse. Points 1.11 and 1.12 require only that content providers be given the right to contest the decision “within a reasonable time period” and through a “user-friendly” mechanism. Point 18 encourages “proportionate and specific proactive measures” in tackling illegal content, in particular by automated means. Points 19 and 20 call for “effective and appropriate safeguards” but essentially leave this issue to Member States, further increasing disparity between them and fostering confusion. Point 21 unhelpfully and vaguely calls for protection against abuse without specifying what this protection might consist of.
  2. The Guidelines are in direct conflict with Article 15 of the E-Commerce Directive. That article explicitly states that Member States shall not impose on providers a general obligation to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity. While it is perfectly possible and allowed, in a civil and democratic society, to repeal this article, this has not happened yet.
  3. The Guidelines bundle different kinds of illegal content. Illegal content is defined in point 1.4.b as “any information which is not in compliance with Union law or the law of a Member State concerned”. The definition is unacceptably wide. Furthermore, it conflates content which is illegal at EU level (because of harmonisation efforts) with material that is illegal only in some states, which, in turn, also brings subsidiarity into play (for why would Member States transfer issues concerning locally illegal content onto the EU). Further to that, copyright, hate speech, child pornography and terrorist content require vastly different monitoring and enforcement tools. It makes no sense to treat them as one.
  4. Finally, and most importantly, good laws cannot be replaced by decrees. The Commission understands all too well that an effort to pass legislation on hate speech alone (never mind anything else covered by the Recommendation) would require lengthy public consultations, acrimonious disputes between stakeholders and endless negotiations as texts progress through various committees in the effort to find a compromise between the Parliament and the Council. Whereas soft law is a legitimate and useful governance tool, it is defined by its non-binding character.2 A recommendation that threatens legislation unless it achieves its desired effect is not soft law; it is a decree. We should remind the Commission that the reason why EU Internet laws have been a success3 is precisely that democratic procedures have been respected. It may be tempting to take shortcuts, but doing so undermines the legitimacy of EU institutions and will ultimately do little to combat illegal content.
  1. In fact, it got into trouble for exactly the same reasons with another badly-formulated proposal – Copyright in the Digital Single Market.
  2. As evidenced, among others, in Article 288 TFEU.
  3. The E-Commerce and InfoSoc directives have lasted over 17 years while the Data Protection Directive is over 20 years old.