EU Digital Services Act
Intermediary services
The Digital Services Act is addressed solely to so-called intermediary services. The definition of an intermediary service is somewhat broader than the term might suggest. Such a service exists in the case of
a "mere conduit" consisting of transmitting information provided by a user in a communications network or providing access to a communications network,
a "caching" service, which consists of transmitting information provided by a user in a communication network, whereby an automatic, time-limited intermediate storage of this information takes place for the sole purpose of making the transmission of the information to other users more efficient at their request,
a "hosting" service, which consists of storing information provided by a user on their behalf.
Objective of the Digital Services Act
The stated aim of the Digital Services Act is to "establish consistent rules for a safe, predictable and trusted online environment in which the fundamental rights enshrined in the [EU Charter of Fundamental Rights] are effectively protected."
Very large online platforms
The Digital Services Act contains additional rules for so-called "very large online platforms", which only apply where the platform has an average of at least 45 million active users per month in the EU. The obligations are thus differentiated to some extent according to the size of the online platform, which brings certain reliefs for start-ups and small and medium-sized enterprises (SMEs).
Limitations of liability
Depending on the classification of the service in question, the Digital Services Act sets out different requirements and also different liability provisions.
If an intermediary service takes the form of a "mere conduit", liability for the transmitted information is excluded if
the service provider does not initiate the transmission,
the service provider does not select the addressee of the transmitted information and
the service provider does not select or change the transmitted information.
If an intermediary service takes the form of a "caching" service, liability is excluded with regard to the automatic, temporary intermediate storage carried out for the sole purpose of making the onward transmission of the information to other users at their request more efficient, provided that
the service provider does not change the information,
the service provider complies with the conditions for access to the information,
the service provider complies with the rules for updating the information set out in widely recognized and used industry standards,
the service provider does not interfere with the permitted use of technology to collect data on the use of the information as set out in widely recognized and used industry standards, and
the service provider acts expeditiously to remove the stored information or disable access to it as soon as it obtains actual knowledge that the information at the original source of the transmission has been removed from the network, that access to it has been disabled, or that a court or administrative authority has ordered its removal or disabling.
In the case of an intermediary service in the form of a hosting service, liability with regard to the stored information is excluded if the service provider
has no actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of any facts or circumstances from which the illegal activity or illegal content is apparent, and
upon obtaining such knowledge or awareness, acts expeditiously to remove the illegal content or to disable access to it.
In addition to these provisions on exemption from liability, it is pleasing that the Act expressly imposes no general obligation to proactively monitor or investigate content.
Regulatory requirements
However, the Digital Services Act does not only contain provisions limiting liability; it also imposes obligations on the design of intermediary services, in particular of online platforms. These include, among others:
Specifications for content in the General Terms and Conditions (GTC), in particular information on all policies, procedures, measures and tools used to moderate content, including algorithmic decision-making and human reviews;
Establishment of contact points for direct electronic communication with authorities;
Publication of annual transparency reports;
Prohibition of so-called "dark patterns";
stricter transparency requirements for recommender systems, i.e. the algorithmic selection and ranking of the content displayed to users, for example in a "timeline".
In addition, there are numerous further requirements depending on the type of intermediary service and the size of the online platform. The Act also regulates various powers and responsibilities of the authorities, including the power to conduct on-site inspections.
Sanctions
Violations can be sanctioned with fines of up to 6% of the intermediary service provider's annual turnover.
This article is part of the overview of the current changes under the EU data strategy and the New Legislative Framework. Please note that the legislative proposal is currently only a draft (albeit one marked as "final"). It is therefore not yet applicable law, and changes may still be made during the legislative process. However, given the relatively short transitional periods, it is already worth taking a close look at the upcoming law.