Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending
Directive 2000/31/EC (COM(2020)0825 – C9-0418/2020 – 2020/0361(COD))
COMPROMISES
CA 1
Amendment 257
Irena Joveva
Amendment 258
Laurence Farreng
Amendment 43 Rapporteur
Proposal for a regulation
Article 1 – paragraph 5 – point c
Union law on copyright and related rights, in particular Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market, as implemented in national laws in line with the Directive.
CA 2
Amendment 259
Sabine Verheyen
Amendment 260
Petra Kammerevert, Christel Schaldemose
Amendment 261
Marcel Kolaja
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
Member States’ competences to adopt legislation aimed at protecting or promoting the freedom of expression and information, media freedom and pluralism, and cultural and linguistic diversity, addressed to providers of intermediary services, shall not be affected by this Regulation if this is deemed necessary to ensure, protect and promote the freedom of information and media, or to foster the diversity of media and opinion or cultural and linguistic diversity.
CA 3
Amendment 45 Rapp
Amendment 262 Petra Kammerevert
Proposal for a regulation
Article 1 – paragraph 5 b (new)
Any contractual provisions between an intermediary service provider and a trader, business user or recipient of its service which are contrary to this Regulation shall be invalid. This Regulation shall apply irrespective of the law applicable to contracts concluded between providers of intermediary services and a recipient of the service, a consumer, a trader or a business user.
CA 4
Amendment 46
Rapporteur
Amendment 66
Rapporteur
Amendment 67
Rapporteur
Amendment 263
Rapporteur
Amendment 264
Rapporteur
Amendment 271
Petra Kammerevert, Christel Schaldemose
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
“Editorial content provider” means the natural or legal person who has editorial responsibility for the content and services they offer and determines the manner in which they are organised, who is subject to sector-specific regulation, including self-regulatory standards in the media and press sectors, and who has put in place complaints-handling mechanisms to resolve content-related disputes.
CA 5
Amendment 273
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Amendment 52 Rapp
Proposal for a regulation
Article 5 a (new)
Providers of intermediary services shall be deemed ineligible for the exemptions from liability in line with Articles 3, 4 and 5, and liable to pay penalties in accordance with Article 42, when they do not comply with the due diligence obligations set out in this Regulation.
CA 6
Amendment 279
Petra Kammerevert, Christel Schaldemose
Amendment 299
Sabine Verheyen
Amendment 303
Sabine Verheyen
Proposal for a regulation
Article 7 a (new)
Prohibition of interference with content and services offered by editorial content providers
Intermediary service providers shall not remove, disable access to or otherwise interfere with content and services made available by editorial content providers as defined in Article 2 – paragraph 1 – point q a (new).
Editorial content providers’ accounts shall not be suspended on the grounds of legal content and services they offer. This Article shall not affect the possibility for an independent judicial or independent administrative authority in line with Directive 2010/13/EU of requiring the editorial content provider to terminate or prevent an infringement of applicable Union or national law.
CA 7
Amendment 297
Dace Melbārde
Amendment 298
Marcel Kolaja
Amendment 306
Marcel Kolaja
Amendment 309
Marcel Kolaja
Proposal for a regulation
Article 12 – paragraph 1
Terms and conditions of providers of intermediary services shall respect the principles of human rights as enshrined in the Charter and international law. Providers of intermediary services shall include and publish, in their terms and conditions, information on any restrictions or modifications that they impose in relation to the use of their service in respect of information provided by the recipients of the service. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible and machine-readable format in the language in which the service is offered. Providers of intermediary services shall inform the recipients of their services of changes to their terms and conditions in a timely manner.
CA 8
Amendment 300
Marcel Kolaja
Amendment 304
Irena Joveva
Proposal for a regulation
Article 12 – paragraph 1 a (new)
Providers of intermediary services shall publish summary versions of their terms and conditions in clear, user-friendly and unambiguous language, and in an easily accessible and machine-readable format. Such a summary shall include the main elements of the information requirements, including the possibility of easily opting out from optional clauses, as well as information on the remedies and redress mechanisms available, such as the possibility to modify or influence the main parameters of recommender systems and advertisement options.
CA 9
Amendment 301
Marcel Kolaja
Amendment 302
Petra Kammerevert, Christel Schaldemose
Proposal for a regulation
Article 12 – paragraph 2
Providers of intermediary services shall act in a coherent, predictable, non-discriminatory, transparent, diligent, non-arbitrary and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, in compliance with procedural safeguards and with due regard to national and Union law and the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service, in particular the freedom of expression and information, as enshrined in the Charter.
CA 10
Amendment 299
Sabine Verheyen
Amendment 305
Petra Kammerevert, Christel Schaldemose
Amendment 307
Petra Kammerevert, Christel Schaldemose
Amendment 310
Marcel Kolaja
Proposal for a regulation
Article 12 – paragraph 2 b (new)
Terms and conditions, or specific provisions thereof, community standards or any other internal guidelines or tools implemented by an intermediary service provider shall not be applied contrary to Article 7a.
Providers of intermediary services shall ensure that their terms and conditions, as well as other policies, procedures, measures and tools used for the purpose of content moderation, are applied and enforced in such a way as to prohibit any removal, suspension, disabling of access to or other interference with editorial content and services of an editorial content provider, or their account, on the basis of the legal content they offer. This Article shall not affect the possibility for an independent judicial or independent administrative authority in line with Directive 2010/13/EU of requiring the editorial content provider to terminate or prevent an infringement of applicable Union or national law.
Intermediary service providers shall notify editorial content providers pursuant to Article 7a beforehand of any proposed changes to their general terms and conditions and to their parameters or algorithms that might affect the organisation, presentation and display of content and services.
The proposed changes shall not be implemented before the expiry of a notice period that is reasonable and proportionate to the nature and extent of the proposed changes and their impact on editorial content providers and their content and services. That period shall begin on the date on which the online intermediary service provider notifies the editorial content providers of the proposed changes.
The provision of new content and services on the intermediary service by an editorial content provider before the expiry of the notice period shall not be considered a conclusive or affirmative action, given that such content is of particular importance for the exercise of fundamental rights, in particular the freedom of expression and information.
Member States shall ensure that editorial content providers’ possibilities to contest decisions of online platforms or to seek judicial redress in accordance with the laws of the Member State concerned remain unaffected.
CA 11
Amendment 68 Rapp
Amendment 321
Hannes Heide, Petra Kammerevert
Proposal for a regulation
Article 13 a (new)
Traceability of business customers
A provider of intermediary services shall ensure that business customers can only use its services to promote messages on, or to offer products, content or services to, consumers located in the Union if, prior to the use of its services, the provider of intermediary services has obtained the following information:
(a) the name, address, telephone number and electronic mail address of the business customer;
(b) a copy of the identification document of the business customer or any other electronic identification as defined by Article 3 of Regulation (EU) No 910/2014 of the European Parliament and of the Council;
(c) the bank account details of the business customer, where the business customer is a natural person;
(d) the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and of the Council or any relevant act of Union law;
(e) where the business customer is registered in a corporate or trade register or similar public register, the register in which the business customer is registered and its registration number or equivalent means of identification in that register;
(f) a self-certification by the business customer committing to only offer products or services that comply with the applicable rules of Union law.
The provider of intermediary services shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any publicly accessible official online database or online interface made available by a Member State or the Union, or through requests to the business customer to provide supporting documents from reliable and independent sources.
Where the provider of intermediary services obtains indications, including through a notification by law enforcement agencies or other individuals with a legitimate interest, that any item of information referred to in paragraph 1 obtained from the business customer concerned is inaccurate, misleading, incomplete or otherwise invalid, that provider of an intermediary service shall request the business customer to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law. Where the business customer fails to correct or complete that information, the provider of intermediary services shall suspend the provision of its service to the business customer until the request is complied with.
The provider of intermediary services shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for a period of two years following the termination of their contractual relationship with the business customer concerned. They shall subsequently delete the information.
Providers of intermediary services shall apply the identification and verification measures not only to new business customers but shall also update the information they hold on existing business customers on a risk-sensitive basis, and at least once a year, or when the relevant circumstances of a business customer change.
Without prejudice to paragraph 2, the provider of intermediary services shall disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation, as well as pursuant to proceedings initiated under other relevant provisions of Union or national law.
The provider of intermediary services shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service in a clear, easily accessible and comprehensible manner.
The provider of intermediary services shall design and organise its online interface in a way that enables business customers to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law.
The Digital Services Coordinator of establishment shall determine dissuasive financial penalties for non-compliance with any provision of this Article.
CA 12
Amendment 338
Irena Joveva
Amendment 339
Marcel Kolaja
Amendment 340
Marcel Kolaja
Proposal for a regulation
Article 14 – paragraph 5
The provider shall also, without undue delay, notify the individual or entity whose content was removed or challenged of its action in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that action, including the opportunity to reply, unless this would obstruct the prevention and prosecution of serious criminal offences. The provider shall ensure that the decision-making process is reviewed and any final action is taken by qualified staff, regardless of the measures used.
CA 13
Amendment 72 Rapp
Amendment 342
Sabine Verheyen
Amendment 343
Laurence Farreng
Proposal for a regulation
Article 14 – paragraph 6 a (new)
When the provider of hosting services decides to remove or disable illegal information provided by the recipient of the service, the provider shall also prevent the reappearance of that information. This order may also extend to specific information that is identical to the notified information, or to equivalent information which remains essentially unchanged from the information previously notified and removed or to which access was disabled. The application of this requirement must not lead to any general monitoring obligation.
CA 14
Amendment 359
Marcel Kolaja
Amendment 360
Irena Joveva
Proposal for a regulation
Article 17 – paragraph 5
Online platforms shall ensure that the decisions referred to in paragraph 4 are not taken solely on the basis of automated means but have adequate human oversight and are reviewed by qualified staff, to whom adequate initial and ongoing training on the applicable legislation is to be provided, including, where relevant, professional support, qualified psychological assistance and legal advice.
CA 15
Amendment 383
Marcel Kolaja
Amendment 74 Rapp
Amendment 382
Irena Joveva
Proposal for a regulation
Article 19 – paragraph 5
Where an online platform has information indicating that a trusted flagger submitted a significant number of wrongful, insufficiently precise or inadequately substantiated notices, or notices regarding legal content, through the mechanisms referred to in Article 14, including information gathered in connection with the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned and inform the Board and the other Digital Services Coordinators, providing the necessary explanations and supporting documents.
CA 16 by S+D, Renew, Greens
Amendment 75 Rapp
Amendment 386
Irena Joveva
Amendment 387
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Amendment 385
Marcel Kolaja
Proposal for a regulation
Article 20 - paragraph 1
Online platforms shall restrict, for a reasonable period of time and after having issued a prior warning and demonstrated the illegality of the content, the provision or some features of their services to recipients of the service that frequently provide illegal content.
CA 16a
Amendment 75 Rapp
Amendment 386
Irena Joveva
Amendment 387
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Proposal for a regulation
Article 20 – paragraph 1
Online platforms shall suspend, or otherwise restrict, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide or disseminate illegal content. In cases of repeated suspension, providers of hosting services shall terminate the provision of their services and, where technically possible, introduce mechanisms that prevent the re-registration of recipients of the service that frequently provide or disseminate illegal content.
CA 17
Amendment 75 Rapp
Amendment 388
Marcel Kolaja
Amendment 389
Irena Joveva
Amendment 390
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Proposal for a regulation
Article 20 – paragraph 2
Online platforms may suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms, internal complaint-handling systems and out-of-court dispute settlement bodies referred to in Articles 14, 17 and 18, respectively, by individuals or entities or by complainants that frequently or repeatedly submit notices or complaints or initiate dispute settlements that are unfounded.
CA 18
Amendment 83 Rapp
Amendment 415
Petra Kammerevert, Christel Schaldemose
Amendment 416
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Amendment 417
Marcel Kolaja
Proposal for a regulation
Article 26 – paragraph 1 – point b
any negative effects for the exercise of the fundamental rights, including the respect for human dignity, private and family life, freedom of expression and information, including the freedom and pluralism of the media, freedom of the arts and sciences, the right to education, the prohibition of discrimination and the rights of the child, as enshrined in the Charter;
CA 19
Amendment 92 Rapp
Amendment 435
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Amendment 436
Laurence Farreng
Amendment 437
Irena Joveva
Amendment 440
Marcel Kolaja
Amendment 442
Marcel Kolaja
Amendment 443
Marcel Kolaja
Proposal for a regulation
Article 29 – paragraph 1
The parameters used in recommender systems shall always be set up in such a way that they reduce any potential bias and that they are non-discriminatory and adaptable. Online platforms that use recommender systems shall set out separately, in their terms and conditions and on a designated, easily accessible webpage, in a manner which is clear, accessible and easily comprehensible for all, the information concerning the role and functioning of recommender systems and the main parameters used in their recommender systems, as well as offer control through the available options for the recipients of the service to modify or influence those parameters that they may have made available, including options which are not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679. Online platforms shall ensure that the option activated by default for the recipient of the service is not based on profiling.
In addition to the obligations applicable to all online platforms, very large online platforms may offer to the recipients of the service the choice of using recommender systems from third-party providers, where available. In these cases, third parties shall be offered access to the same operating system, hardware or software features that are available or used in the provision by the platform of its own recommender systems. Any processing of personal data related to those activities shall comply with Regulation (EU) 2016/679, in particular Articles 6(1)(a) and 5(1)(c).
CA 20
Amendment 94 Rapp
Amendment 450
Marcel Kolaja
Proposal for a regulation
Article 31 – paragraph 2
With regard to moderation and recommender systems, very large online platforms shall provide the Digital Services Coordinator and/or the Commission, upon request, access to algorithms by providing the relevant source code and associated data that allow the detection of possible biases. When disclosing these data, very large online platforms shall have a duty of explainability and ensure close cooperation with the Digital Services Coordinator or the Commission to make moderation and recommender systems fully understandable. When a bias is detected, very large online platforms should correct it expeditiously following requirements from the Digital Services Coordinator or the Commission. Very large online platforms shall be able to demonstrate their compliance at every step of the process pursuant to this Article.
CA 21
Amendment 101 Rapp
Amendment 452
Dace Melbārde
Article 31 a (new)
Article 31a
1. Upon request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1).
2. In order to be vetted, researchers shall be affiliated with academic institutions, media, civil society or international organisations representing the public interest, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit to and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate.
4. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
5. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested by vetted researchers because of one of the following two reasons:
(a) it does not have access to the data;
(b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets, in exceptional circumstances, when justified by an obligation under Article 18 of Directive (EU) 2020/0359 and Article 32(1)(c) of Regulation (EU) 2016/679.
6. Requests for amendment pursuant to point (b) of paragraph 5 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request.
The Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request.
CA 22
Amendment 109 Rapp
Amendment 478
François-Xavier Bellamy
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service, as well as other parties having a legitimate interest and meeting relevant criteria of expertise and independence from any intermediary service provider, shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority.
CA 23
Amendment 118
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Amendment 119
Irena Joveva
Amendment 120
Victor Negrescu
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’). These rights include, among others, the right to freedom of expression and information, the freedom and pluralism of the media, the right to privacy and to the protection of personal data, the freedom to conduct a business, the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination.
CA 24
Amendment 125
Sabine Verheyen
Amendment 1 Rapp
Amendment 126
Petra Kammerevert, Christel Schaldemose
Amendment 127
Marcel Kolaja
Amendment 128
Laurence Farreng
Amendment 129
Petra Kammerevert, Christel Schaldemose
Amendment 131
Petra Kammerevert, Christel Schaldemose
Proposal for a regulation
Recital 9
(9) Respecting the Union’s subsidiary competence to take cultural aspects into account in its action according to Article 167(4) of the Treaty, this Regulation should not affect Member States’ competences in their respective cultural policies, in particular national measures addressed to intermediary service providers in order to protect the freedom of expression and information and media freedom and to foster media pluralism as well as cultural and linguistic diversity. This Regulation should complement, yet not affect the application of, rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended, and Regulation (EU) …/… of the European Parliament and of the Council – the proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply while not affecting the Member States’ competences to adopt and further develop laws, regulations and other measures in order to secure and promote the freedom of expression and information in the media, promoting press freedom in line with the Charter of Fundamental Rights, as well as cultural and linguistic diversity. Where those acts leave Member States the possibility of adopting certain measures at national level, this possibility should remain unaffected by this Regulation, in particular their right to adopt stricter measures. In the event of a conflict between this Regulation and Directive 2010/13/EU as amended, as well as national legislation taken in accordance with that Directive, Directive 2010/13/EU should prevail.
CA 25
Amendment 2 Rapp
Amendment 133
Victor Negrescu
Amendment 134
Laurence Farreng
Amendment 135
Marcel Kolaja
Proposal for a regulation
Recital 11
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, in particular Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market, as implemented in national laws, which establish specific rules and procedures that should remain unaffected. As a whole, the Regulation must ensure legal certainty for platforms and safeguard the fundamental rights of users. No provision in this Regulation should lead to less favourable solutions to guarantee a high level of protection of copyright and related rights than the one prevailing before its entry into force, or thereafter, in the Union’s and its Member States’ positive law relating to the protection of literary and artistic property.
CA 26
Amendment 3 Rapp
Amendment 142
Irena Joveva
Amendment 143
Martina Michels, Alexis Georgoulis
Amendment 144
Victor Negrescu
Proposal for a regulation
Recital 12
(12) Currently, the definition of illegal content varies based on national law, and ambiguous definitions of this term in the Regulation would create an unpredictable regulatory environment for all digital service providers in Europe. Without a clear definition, digital service providers and intermediaries will be held to opaque and unreasonable standards. Confusion about what constitutes illegal content could lead service providers and intermediaries to wrongfully restrict some types of content, which would harm fundamental rights such as the freedom of expression and opinion. Therefore, in order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should underpin the general idea that what is illegal offline should also be illegal online, and what is legal offline should also be legal online. The concept should also be defined appropriately to cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, the unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use or illegal dissemination of copyright-protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
CA 27
Amendment 4 Rapp
Amendment 145
Irena Joveva
Amendment 146
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, search engines, content-sharing platforms or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms for the purposes of this Regulation where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
CA 28
Amendment 5 Rapp
Amendment 152
Petra Kammerevert, Christel Schaldemose
Amendment 153
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Amendment 154
Irena Joveva
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical, automatic and passive processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary services itself, including, but not limited to, where the provider optimises, promotes or moderates content beyond offering basic search and indexing functionalities that are absolutely necessary to navigate the content, or incites users to upload content, irrespective of whether the process is automated, where the information has been developed under the editorial responsibility of that provider.
CA 29
Amendment 6 Rapp
Amendment 155
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Proposal for a regulation
Recital 18 a (new)
(18a) Directive 2000/31/EC states that the exemptions from liability cover only cases where the activity of the information society service
provider is limited to the technical process of operating and giving access to a communication network over which information made
available by third parties is transmitted or temporarily stored, for the sole purpose of making the transmission more efficient. This activity
is of a mere technical, automatic and passive nature, which implies that the information society service provider has neither knowledge of,
nor control over, the information which is transmitted or stored. This implies that all active services are excluded from the limited liability
regime. In that context, those exemptions should also not be given to providers of intermediary services that do not comply with the due
diligence obligations of this Regulation.
CA 30
Amendment 8 Rapp
Amendment 156
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Amendment 157
Petra Kammerevert, Christel Schaldemose
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. In order to ensure harmonised implementation of illegal content removal throughout the Union, the provider should, without delay, remove or disable access to said illegal content. In practice, such an order to remove illegal content could also effectively address the reappearance of that illegal content. If a hosting service provider is ordered by an administrative or judicial authority to prevent infringements, such an order should in principle be limited to a specific infringement and to specific parts of the service, but may be extended to all copies of that specific content, to efficiently ensure that the infringing content does not reappear, taking into account the potential harm the illegal content in question may create. The prevention of the reappearance of illegal content should under no circumstances give rise to a general monitoring obligation or an obligation for the provider to carry out investigations without a specific reason, and safeguards must be established so that stay-down measures never lead to any unavailability of legal content. A general monitoring obligation should be assumed if a hosting service provider is obliged to screen an unspecified amount of information provided by a recipient of the service in order to prevent a specific infringement of the applicable law. The removal or disabling of access should be undertaken with due respect for all relevant principles enshrined in the Charter of Fundamental Rights, including the freedom of expression. The provider can obtain actual knowledge or awareness through its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and, where appropriate, act against the allegedly illegal content.
CA 31
Amendment 12 Rapp
Amendment 162
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Amendment 163
Victor Negrescu
Proposal for a regulation
Recital 26
Whilst the rules in Chapter I of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall the generally important role played by those providers. In many cases, such providers may be the best placed to address the problem of illegal content and activities by removing or limiting access to such content or by bringing such activities to an end. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is appropriate to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content.
CA 32
Amendment 168
Petra Kammerevert, Christel Schaldemose
Amendment 264
Sabine Verheyen
Amendment 20 Rapp
Proposal for a regulation
Recital 28 a (new)
(28 a) Since editorial content providers hold editorial responsibility for the content and services they make available, a presumption of legality should exist in relation to the content provided by those providers who carry out their activities in respect of European values and fundamental rights. Such content and services should benefit from a specific regime that prevents multiple control of that content and those services. Those content and services should be offered in accordance with professional and journalistic standards as well as legislation, and are already subject to systems of supervision and control, often enshrined in commonly accepted self-regulatory standards and codes. In addition, they usually have in place complaints-handling mechanisms to resolve content-related disputes. Editorial responsibility means the exercise of effective control both over the selection of content and over its provision by means of its presentation, composition and organisation. Editorial responsibility does not necessarily imply any legal liability under national law for the content or the services provided. In any case, any provider of an audiovisual media service within the meaning of Article 1(1)(a) of Directive 2010/13/EU and publishers of press publications within the meaning of Article 2(4) of Directive (EU) 2019/790 should be considered as editorial content providers for the purposes of this Regulation. Intermediary service providers should refrain from removing, suspending or disabling access to any such content or services, and should be exempt from liability for such content and services. Compliance by editorial content providers with these rules and regulations should be overseen by the respective independent regulatory authorities, bodies or both, and the respective European networks they are organised in.
CA 33
Amendment 19 Rapp
Amendment 179
Sabine Verheyen
Amendment 178
Marcel Kolaja
Amendment 180
Irena Joveva
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of the rights of the recipients of the service and the avoidance of unfair or arbitrary outcomes. Terms and conditions should be summarised in a clear, accessible and easily comprehensible manner while offering the possibility of opting out from optional clauses. Intermediary service providers should be prohibited from drawing up terms and conditions that go against European and national laws and that lead to the removal of, disabling of access to or other kinds of interference with content and services of editorial content providers. The freedom and pluralism of the media should be respected. To this end, Member States should ensure that editorial content providers’ possibilities to contest decisions of online platforms or to seek judicial redress in accordance with the laws of the Member State concerned remain unaffected.
CA 34
Amendment 22 Rapp
Amendment 187
Laurence Farreng
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned (‘notice’), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content (‘action’). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice in order to ensure the effective operation of notice and action mechanisms. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. Moreover, the notification and action mechanism should be supplemented by actions aimed at preventing the reappearance of content which has been identified as illegal or which is identical to content which had been identified and withdrawn as illegal. The application of this requirement must in no way lead to a general monitoring obligation.
CA 35
Amendment 23 Rapp
Amendment 191
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Amendment 192
Irena Joveva
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of efficient, proportionate and accurate automated means accompanied by human oversight, the provider should inform the recipient of its decision, the reasons for its decision and the available effective redress possibilities to rapidly contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
CA 36
Amendment 24 Rapp
Amendment 199
François-Xavier Bellamy
Amendment 200
Irena Joveva
Amendment 204
Martina Michels, Alexis Georgoulis
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, have significant legitimate interests and have demonstrated competence for the purposes of detecting, identifying and notifying illegal content, and that they work in a diligent and objective manner. Such entities can also be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, right holders‘ representatives and organisations of industry could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions, including their competence and objectivity. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.
CA 37
Amendment 25 Rapp
Amendment 206
Marcel Kolaja
Amendment 207
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by repeatedly providing or disseminating illegal content or by repeatedly submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or that the notices or complaints are unfounded, respectively. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. Redress should always be open against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
CA 38
Amendment 30 Rapp
Amendment 213
Irena Joveva
Amendment 217
Victor Negrescu
Proposal for a regulation
Recital 52
(52) Advertising funding helps to ensure that European citizens can enjoy news and entertainment services for free or at a reduced rate. Without effective advertising, funding for all sorts of media would be greatly reduced, which may lead to more expensive TV subscriptions and reduced plurality and independence of newspapers and magazines, and some radio stations would lack the ability to provide news and entertainment throughout the day, to the detriment of media pluralism and cultural diversity. Advertising is a key source of growth for a number of audiovisual media service providers, press publishers and radio stations. The use of data, in full compliance with the obligations set out in Regulation (EU) 2016/679 and Directive 2002/58/EC, is a way to improve the effectiveness of advertising. It is therefore important for this Regulation to focus on delivering more advertising transparency while not negatively affecting the effectiveness of advertising for news and entertainment services.
However, online advertising can contribute to significant risks, ranging from advertisements that are themselves illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling, and should be able to opt for less intrusive forms of advertising that do not require any tracking of user interaction with content. The requirements of this Regulation on the provision of information relating to advertisements are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for advertising. Additionally, online platforms should provide recipients of the service to whom they supply online advertising, when requested and to the extent possible, with information that allows recipients of the service to understand how data was processed, the categories of data or criteria on the basis of which ads may appear, and the data that was disclosed to advertisers or third parties, and should refrain from using any aggregated or non-aggregated data, which may include anonymised and personal data, without the data subject’s explicit consent. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
CA 39
Amendment 31 Rapp
Amendment 220
Irena Joveva
Amendment 222
Laurence Farreng
Amendment 223
Ibán García Del Blanco, Marcos Ros Sempere, Domènec Ruiz Devesa
Proposal for a regulation
Recital 57
Three categories of risks should be assessed in depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products, or the illegal display of copyright-protected content. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may be incorporated in the basic programming of the algorithms used by the very large online platform or arise from the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition, or the way platforms' terms and conditions, including content moderation policies, are enforced. Therefore, it is necessary to promote adequate changes in platforms' conduct, a more accountable information ecosystem, enhanced fact-checking capabilities and collective knowledge on disinformation, and the use of new technologies in order to improve the way information is produced and disseminated online. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
CA 40
Amendment 33 Rapp
Amendment 229
Petra Kammerevert, Christel Schaldemose
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the risks identified in the risk assessment. Very large online platforms should, under such mitigating measures, enhance or otherwise adapt the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they limit the dissemination of illegal content, for instance by building in systems to demote content identified as harmful, introducing artificial delays to limit virality, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources, such as public interest information provided by public authorities or international organisations, or content under an editorial content provider’s control and subject to specific standards, sector-specific regulation and oversight. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They should also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. Mitigation of risks which would lead to removing, disabling access to or otherwise interfering with content and services for which an editorial content provider holds editorial responsibility should not be considered reasonable or proportionate.
CA 41
Amendment 39 Rapp
Amendment 244
Irena Joveva
Proposal for a regulation
Recital 68
It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal or harmful content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of risks on society, such as coordinated operations aimed at amplifying information, for instance through the use of bots, fake accounts and proxy services for the creation and propagation of fake or misleading information, sometimes with the purpose of obtaining economic or political gain, which are particularly harmful for vulnerable recipients of the service. Other areas for consideration could be to improve transparency regarding the origin of information and the way it is produced, sponsored, disseminated and targeted, to promote the diversity of information through support of high-quality journalism and the relation between information creators and distributors, and to foster the credibility of information by providing an indication of its trustworthiness and improving the traceability of information from influential information providers, whilst respecting the confidentiality of journalistic sources. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal by an online platform of the Commission’s invitation to participate in the application of such a code of conduct must be taken into account when determining whether the online platform has infringed the obligations laid down by this Regulation. When codes of conduct are used as a risk mitigating measure, they should be binding for very large online platforms, subject to oversight by the Digital Services Coordinator.