On Wednesday, 16 December 2020, the European Commission published the much-anticipated Digital Services Package, aimed at modernising the EU’s framework for digital services. The proposals are a response to the rise of online platforms, which have significantly changed the EU market for online products and services. The Digital Services Package will revise the E-Commerce Directive (2000/31/EC) and represents a significant new chapter in the EU’s approach to regulating the online sale of products and digital services. It will have far-reaching effects for online platforms and marketplaces, as well as a broad range of businesses that sell products and services online. To put this in perspective, the European Commission compared the level of technological disruption caused by the rise of online platforms to the invention of the car – and suggested that the Digital Services Package will be akin to the invention of the traffic light. The intention is very much to control online traffic and to stop “red light” behaviours perceived as non-compliant.

The European Commission acknowledges that online platforms have created “significant benefits” for consumers and innovation, facilitating cross-border trading within and outside the EU. However, the Commission also suggests that online platforms have brought new risks, and has particular concerns that they can be used as a vehicle for disseminating illegal content, selling unsafe or counterfeit goods or providing illegal services online. It also believes that large platforms, as “gatekeepers” to online markets, should play a greater role.

The proposed Digital Services Package comprises two pieces of draft legislation:

  1. The Digital Services Act (DSA) establishing a common set of regulatory responsibilities for providers of online intermediary services (such as online platforms like social networks or online marketplaces); and
  2. The Digital Markets Act (DMA) establishing rules for large online platforms which act as “gatekeepers”, to ensure that markets impacted by them remain fair and competitive and to promote innovation, growth and competitiveness, particularly of European digital innovators, scale-ups, SMEs and new entrants to the market.

This blog provides our initial thoughts on the draft DSA and its implications for product manufacturers and suppliers. Given the significance of the proposals, we will be following up with more detailed commentary over the coming weeks, and hosting a Productwise webinar to discuss the proposals in the New Year.

For now, here is our take on the key parts of the draft DSA.

  • Extra-territorial effect: the obligations under the DSA extend to online services providers established anywhere in the world that offer their services in the EU, as evidenced by a “substantial connection” to the EU.
  • Definition of illegal content: ‘illegal content’ is defined broadly to mean “any information, which, in itself or by its reference to an activity, including the sale of products or provision of services is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law”. There was no similar definition in the E-Commerce Directive, which did not even refer specifically to products (though the Court of Justice of the European Union determined they were within scope). The Recitals explain that this definition captures the sale of products that do not comply with EU laws, the sale of counterfeit products and activities involving infringements of EU consumer protection law (among other examples). These are key areas of focus for the EU – and a big driver for reform in this area.
  • Bolstered rules on liability: the proposals in the DSA preserve the safe harbours that existed under the E-Commerce Directive, but codify existing case law to clarify their scope and the distinction between active and passive online intermediaries. The DSA specifically removes the protection of the hosting safe harbour for liability under consumer protection laws if online platforms present information or enable transactions in a way that would lead an average and reasonably well-informed consumer to believe the information, product or service is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. In effect, the active/passive distinction is subject to an objective test of reasonableness.
  • New obligations for online platforms, including:
    • “Know Your Business Customer” checks, which will have to be carried out by online platforms to improve traceability and help identify sellers of unsafe or counterfeit goods. The information required includes proof of identification and a self-certification by the trader committing to only offer products or services that comply with the applicable EU laws. Online platforms will have to make reasonable efforts to assess the reliability of certain traceability information.
    • “Notice-and-action” mechanisms, which must be established by providers of hosting services, including online platforms, so users can notify them of potentially illegal content, including unsafe or counterfeit goods. The DSA sets out the information required in a sufficiently precise and substantiated notice (an explanation of the alleged illegality, the electronic location of the illegal content, the name of the submitter and a statement confirming good faith submission). Where notices fulfil these criteria, the information will be considered to give rise to actual knowledge of the illegal content – which is sufficient to remove the benefit of the hosting safe harbour, meaning the provider could be liable for the content.

      Notices submitted by “trusted flaggers” will be treated with priority. Trusted flagger status can be awarded by a Digital Services Coordinator of a Member State based on set criteria, e.g. that the flagger has particular expertise in relation to the illegal content and represents collective interests.
    • More information for consumers. The DSA contains obligations for online platforms to publish more information about the identity of traders and to design and organise their interfaces in a way that enables traders to provide required pre-contractual information and product safety information under EU law (e.g. point-of-sale warnings or labels).
  • New obligations for “very large online platforms”: platforms providing services to an average of 45 million recipients a month will be subject to additional obligations, which apply cumulatively with the obligations for other platforms. These include carrying out a risk assessment of their vulnerability in respect of unsafe or counterfeit goods being sold on their platforms. They will then be required to put in place “reasonable, proportionate and effective” mitigation measures, tailored to these risks. Examples of these measures are set out in the DSA. In addition, “very large online platforms” will be required to appoint compliance officers responsible for monitoring compliance with the DSA and will also need to have independent audits carried out (at their own expense) to assess compliance with the obligations of the DSA.
  • Creation of the European Board for Digital Services: the DSA establishes an independent advisory group for the Digital Services Coordinators with powers to support the coordination of joint investigations, support competent authorities in the analysis of reports and results of audits of very large online platforms and to issue opinions, recommendations or advice to Digital Services Coordinators.
  • New information and transparency obligations: the DSA proposes a range of new obligations, with a base level applying to all providers and additional obligations for hosting services, online platforms and “very large online platforms”. The obligations include (among others):
    • establishing a single point of contact to facilitate direct communication with Member States’ authorities, the Commission and the Board for Digital Services;
    • providers that offer services in the EU, but are not established in any Member State, must establish a point of contact and designate a legal representative in the EU;
    • providers must include certain information in the terms and conditions, such as any restrictions imposed on the use of their services and the policies, procedures, measures and tools used for content moderation, including algorithmic decision-making and human review;
    • yearly reporting obligations about the removal and disabling of information considered to be illegal content or contrary to the terms and conditions of the provider; and
    • online platforms must disclose information about algorithms used for recommendations.
  • New enforcement powers and desire for cross-border initiatives: the proposals include a requirement for a Digital Services Coordinator to be established in each Member State with powers to participate in joint cross-border oversight and investigation activities. Public authorities will be given “new tools” to order the removal of unsafe products directly, such as “codes of conduct”. The European Commission will have specific powers for supervision, investigation, enforcement and monitoring of “very large online platforms”. The powers include carrying out on-site inspections during which the Commission and auditors or experts appointed by it may (among other things) ask for information about an organisation, its functioning, IT system, algorithms, data-handling and business conduct.
  • Fines of up to 6% of global annual turnover and periodic penalty payments of up to 5% of average daily turnover can be levied against “very large online platforms” for certain breaches of the DSA, including the supply of incorrect, incomplete or misleading information in the context of an investigation.

Next steps

The DSA and DMA legislative proposals will now make their way through the ordinary legislative procedure, involving scrutiny by the European Parliament and the Council of the European Union. The intention is to complete this process within a year to eighteen months, with the Regulations entering into force six months after adoption. This means that we expect to see these new rules applying throughout the EU during 2022.

Posted by Edward Turtle