Brussels, 18 July 2022
Interinstitutional files:
2022/0155 (COD)
WK 10235/2022 ADD 1
LIMITE
JAI
FREMP
ENFOPOL
TELECOM
CRIMORG
COMPET
IXIM
MI
DATAPROTECT
CONSOM
CYBER
DIGIT
COPEN
CODEC
This is a paper intended for a specific community of recipients. Handling and
further distribution are under the sole responsibility of community members.
MEETING DOCUMENT
From: General Secretariat of the Council
To: Law Enforcement Working Party (Police)
N° prev. doc.: 9068/22
Subject: Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse
- comments from delegations on Articles 1 to 7
Delegations will find attached the compilation of comments received from Member States on the above-mentioned proposal following the meeting of the LEWP (Police) on 5 July 2022.
Written comments submitted by Member States
Proposal for a Regulation laying down rules to prevent and combat
child sexual abuse
(9068/22)
Contents
AUSTRIA
BELGIUM
CROATIA
DENMARK
FRANCE
GERMANY
GREECE
HUNGARY
IRELAND
ITALY
LATVIA
LITHUANIA
MALTA
THE NETHERLANDS
PORTUGAL
SPAIN
AUSTRIA
Austria communicates its written comments on Articles 1 to 7 of document ST 9068/22 (the CSA proposal) and a question of a general nature:
The question of a general nature:
Is direct transmission of suspicious reports from US ISPs via the US National Center for Missing and Exploited Children (NCMEC) to the EU Member States then still admissible? Or will reports have to go through the EU Centre in the future?
If this is the case:
- How can it then be ensured that there is no loss of time? Delay would be particularly serious in the case of reports of hitherto unknown abuse material, since it can be assumed in such cases that the victim is still in the custody of the perpetrator.
- Can US internet service providers offering their services in the EU continue to scan their content using the indicators available to them and submit suspicious activity reports to EU law enforcement authorities? Or can these now only be communicated to EU authorities if there is a specific detection order?
The comments on Articles 1 to 7:
Chapter I
Article 1:
Austria has a scrutiny reservation. Austria will examine the limits of the scope of application in the
context of the discussion of the other chapters of the draft and will submit further comments and
proposals for amendments to Article 1 at a later date.
On para. 4:
Recital 9 states that the provisions of the proposal provide for exemptions from the requirements of
confidentiality of communications and traffic data under Articles 5 and 6 of the e-Privacy Directive,
in accordance with Article 15(1) of the e-Privacy Directive. Recital 9 applies this provision by
analogy, as Article 15(1) exclusively empowers Member States to adopt national regulations. The
analogous application of Article 15(1) of the e-Privacy Directive appears questionable for the
following reasons: Recital 11 of the e-Privacy Directive states with regard to Article 15(1) that the
Directive does not apply to areas that are not covered by Community law. The competence of the
member states to enact their own regulations in the areas of public security, national defence and
state security as well as for the enforcement of criminal law provisions therefore remains unaffected
as long as they are designed in conformity with fundamental rights. The aforementioned areas of
law fall predominantly, if not exclusively, within the regulatory competence of the Member States.
This leads to the conclusion that Article 15(1) of the e-Privacy Directive therefore exclusively
"empowers" the Member States to enact such regulations, because only they are competent to enact
regulations in these areas. It is therefore fundamentally doubted that Article 15(1) of the e-Privacy
Directive can be analogised to the extent that a regulatory competence of the EU can be derived
from it, because there is no unintended gap here. Even if Article 15(1) of the e-Privacy Directive
could be applied by analogy, the fact that Article 15(1) only mentions measures of internal and
external security stands in the way of its use in this specific case. As far as can be seen, it has been
assumed so far that the present draft regulation is not intended to deal with law enforcement
measures, but with the harmonisation of the internal market. In this case, the draft would not be a
permissible exception to Articles 5 and 6 of the e-Privacy Directive according to Article 15(1) e-
Privacy Directive. The Presidency and the EC are therefore requested to clarify whether the present
draft is a law enforcement measure within the meaning of Article 15(1) of the e-Privacy Directive.
Furthermore, it is noted that the proposed measures for monitoring and prior checking of the content of users of internet services, without concrete grounds for suspicion and without differentiation, are not proportionate within the meaning of Article 15(1) of the e-Privacy Directive.
Accordingly, there are also massive fundamental rights concerns, in particular with regard to a
violation of the right to privacy under Article 7 CFR and the right to data protection under Article 8
CFR. Fundamental rights concerns exist regarding the de facto elimination of all possibilities to use
end-to-end encryption of communications in messenger or chat services. The technical
implementation of content access to electronic communication is not directly determined by the
draft, but in fact content access can only take place through a fundamental breach of secure end-to-
end encryption. In addition, the error rate in automatic content recognition is also problematically
high. How should this be seen in the context of the required IT security and in relation to secure and
confidential communication?
Article 2:
Austria has a scrutiny reservation. Austria will review the definitions during the examination of the
other chapters of the draft and will make further remarks and amendment proposals for Article 2 at
a later time.
Concerning lit. u:
Is it conceivable that a service provider with its main place of business in a third country has a
representative office in several EU countries? What is the procedure then?
What about internet service providers that do not have a branch or representation in the EU? Will
they have to actively exclude EU citizens from using their services in the future?
Chapter II
Article 4(3):
Communication service providers should provide young users with "options to prevent grooming".
What measures should be considered here?
Article 7(9):
With this provision, the period of application of the detection order is limited to 24 months in the case of a risk under § 207a of the Austrian Criminal Code and to 12 months in the case of a risk of grooming.
Is a new risk analysis mandatory thereafter? Is there an accelerated procedure for extending the order where the risk persists?
BELGIUM
At this moment we would like to uphold a general scrutiny reservation. We will study the answers given by the Commission during the last meeting on 5 July, but would like to raise some further questions:
- We welcome the proposed online seminar in October on the possibilities related to detection technologies and end-to-end encryption. We note in Article 7(4), first subparagraph, point (b), the condition that "the reasons for issuing the detection order outweigh negative consequences for the rights and legitimate interests of all parties affected, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties". Elsewhere, in Article 16(4)(d) on blocking orders, we read that those fundamental rights can also include the provider's freedom to conduct a business. We noted that the Commission mentioned that "if detection technologies without lowering the privacy level do not exist, detection will not be ordered", but also that such technologies do exist. We hope to gain more insight into the achievable practical end results if a provider thus argues that all available solutions provide a lower privacy level than end-to-end encryption (which the study in annex 9 of the impact assessment, especially the table on page 309, clearly confirms), that this poses problems not only for the users but also for the provider's legitimate interests and freedoms, and that this should therefore preclude an obligation to use those solutions.
- We welcome the announced documents which will shed light on the comparison with the Digital Services Act and the Terrorist Content Online Regulation. We note that the Commission informed us that Article 8 of the Digital Services Act would not be considered applicable here in the CSA Regulation; the minimum requirements related to the notice and action mechanism, on the other hand, are to be considered applicable. We wonder whether it would not be better to include more concrete references to the DSA in all the relevant Articles. At the very least, it seems that it should be mentioned in the text that Article 8 is not applicable, as one could easily understand recitals 7 and 8 to imply that this Article 8 concerns an issue that is not, or not fully, addressed in the CSA Regulation.
- The Commission confirmed that voluntary referral by hotlines to the providers is still
possible. Voluntary notifications by the Member States (Article 32) and the EU center
(Article 49) are explicitly mentioned in the text. Would it not be advisable to include a similar reference for the hotlines, for example in recital 70?
- We understand that Article 2(a) and Article 2(g) refer to the definitions of the DSA. How will this, however, be streamlined in practice with the slightly different definitions in the TCO Regulation? How will it be prevented that different understandings arise concerning which providers are meant and how EU legislation applies to them?
- Would it be advisable to look into the use of the word ‘recipient of the service’ instead of
‘user’ in Article 2(h)? Of course, we would then have to be sure that ‘recipient of the
service’ also includes for example a child using the phone of a parent.
- Does Article 6(2) imply that the providers of software application stores would receive the
content of the risk assessment of the providers of the apps?
- The reference "competent judicial authority of the Member State that designated it" in Article 7(1) seems unduly complicated. Could it be clarified, maintaining the objective of targeting the competent authority of the Member State of establishment? For example: "(…) shall have the power to request the competent judicial authority of its/that Member State (…)"?
- In relation to Article 7(4) we wonder if elements of proof (that the detection order is thus
necessary and the conditions are fulfilled) can also be transmitted by the competent
authorities of other Member States. This is not clear to us in the text.
- Based on the phrase "it is likely, despite any mitigation measures that the provider may have taken or will take, that the service is used, (…)" in Article 7(5)(a), we would like to know whether a detection order can be issued without first having to resort to sanctioning a lack of mitigation measures. So, if a provider is not complying with its obligation (based on Article 4) to take mitigation measures, can we immediately issue a detection order, or do we first have to issue a sanction in relation to Article 4 before being able to issue a detection order?
- In Article 7(5), (6) and (7) it reads that "the service is used, to an appreciable extent" for CSAM. This seems to be a very vague notion. On what criteria will this be examined? Could this be specified in the recitals?
- We would also like to better understand the impact of the proposed age of 17 years (or 18 years, as was mentioned by several Member States) in relation to grooming. If a Member State were to issue a detection order, would/could this then – taking into account its own legal framework – still be limited to the age limit in use in that specific Member State? Or would/should this then always be the standardised 17 years (or 18 years), which would mean that a detection order is issued for something that is not illegal in that jurisdiction – which, it seems, could create difficulties with regard to the derogation from the ePrivacy Directive.
CROATIA
Child users: the proposal refers to persons under the age of 17 in relation to the solicitation of sexual activity on the Internet, which differs from the national age limit in the Republic of Croatia.
The process of creating a risk assessment should be further outlined, or explained more precisely in terms of the responsibilities that the national coordinating bodies will have. We join the request for an overview of the functioning and tasks of the national coordination bodies.
DENMARK
We are reviewing the proposal for the Regulation laying down rules to prevent and combat child sexual abuse (the CSA Regulation) with great interest and are looking forward to cooperating with you on the file. We understand that the negotiations are still at an early stage. However, we would like already at this point to raise our concerns relating to Article 28(1)(c) and, in that regard, to propose a rephrasing of the provision as set out below.
”(c) the power to impose fines,
initiate legal proceedings for the imposition of fines, e.g. by courts,
or both, in accordance with national rules and procedures, in accordance with Article 35 […]”
The reason for the proposed rephrasing is that, according to the widely accepted interpretation of Section 3(3) of the Danish Constitutional Act, Danish administrative authorities cannot, with binding effect, impose fines or other sanctions characterised as punitive under Danish law. Thus, if Denmark were to introduce the possibility of applying administrative fines, as prescribed in Article 28(1)(c), as a means of sanctioning breaches of EU law, it is highly likely that Danish courts – ultimately the Supreme Court – would strike down such fines as unconstitutional. As a result, it would not be possible for Danish authorities to impose such fines.
At the same time, under Danish national law an administrative authority cannot directly request a judicial authority to impose a fine. The administrative authority must refer the matter to the police and
public prosecution service with the purpose of enforcing the law by initiating criminal proceedings
before the Danish courts. The reason for this system is that the public prosecution service is subject
to a number of important procedural guarantees, which strengthen the position of the accused and
the defense.
When the public prosecution service has to assess the issue of prosecution in a criminal case, it does
so on the basis of the principle of objectivity, as provided for in the Danish Administration of
Justice Act, which is a fundamental principle in Danish criminal justice and is considered one of the
most important procedural guarantees of legal certainty. The principle entails that the public
prosecution service and the police are obliged to carry out their investigations in an objective
manner. This means that the public prosecution service must ensure that criminals are brought to
justice, but also that the prosecution of innocents does not take place.
As the current Article does not provide for such a procedure, we suggest the above-stated changes to Article 28(1)(c), which we believe still protect the purposes of the current Article. We therefore kindly ask you to amend the Article as suggested.
FRANCE
General comments
The French authorities would like to begin by welcoming the Commission's proposal and by emphasizing that they support the general principles set out in it.
However,
the French authorities will be very careful to preserve certain national mechanisms -
in particular the French platform for harmonization, analysis, cross-checking and guidance of alerts
(PHAROS) - and not to duplicate tools. They point out that the national mechanisms put in place in
some Member States are fully satisfactory and that the balance found must not be undermined in
any way.
In addition, the French authorities emphasize the particular complexity of the mechanisms proposed
by the regulation and suggest injecting a certain amount of
flexibility into them. While the French
authorities understand and support the need to protect the fundamental rights and privacy of
individuals, they believe it is necessary to consider simplifying the proposed tools. For example, the
French authorities do not understand, in the context of a takedown order, the
multiplication of
actors before issuing such an order when the very purpose of takedown is to act quickly to
remove content accessible to all. The French authorities point out that in 2021, the national PHAROS
platform requested more than 118,000 removals of child pornography content.
Furthermore, the French authorities believe that
law enforcement authorities should be more
involved in the implementation of injunctions and in the overall architecture of the regulation.
With respect to private communication services and encrypted content in general, the French
authorities favor finding
solutions that do not weaken encryption and they remain vigilant about
not imposing particular technologies or means on service providers to comply with the new
provisions. Indeed, providers would then be forced to weaken their system to integrate the imposed
technology and thus weaken the encryption mechanisms in place.
Finally, the French authorities wish to emphasize that this proposed regulation must be
implemented in a manner consistent with existing European legislative instruments such as the
TCO (Terrorist Content Online) Regulation and the horizontal DSA (Digital Services Act) Regulation.
Comments by article
- Article 1 (subject matter and scope of the text): to be consistent with the DSA, it would seem appropriate to include search engines in the scope of the actors covered by the Regulation. Article 1 of the proposed Regulation could be supplemented accordingly. In this context, it should be stressed that the proposed CSA (child sexual abuse) Regulation does not provide for an obligation to remove content from search engines. The Commission will have to be questioned on this point, underlining that this tool is frequently used by the French authorities to reduce the accessibility of illicit content to the public.
It is also necessary to
state in the text that the provision does not apply to government
communication systems.
Furthermore, it might be appropriate for article 1 paragraph 3 to make an express reference to the
so-called
"TCO" (terrorism content online) regulation in order to preserve its functioning, as is the
case with other texts within this paragraph.
Risk assessment (Chapter 2, Section 1, Articles 3, 4, 5)
- Section 1 (risk assessment and mitigation obligations): this section provides for the obligation for hosting companies and interpersonal communication services, regardless of their size, to develop an analysis to assess the risks that their services may be misused for the purpose of sexual abuse of minors and, on the basis of this analysis, to take measures to mitigate the risks. The DSA (Digital Services Act) also provides for such obligations (Articles 26 and 27 of the final compromise), but only for very large platforms and search engines. It would therefore be advisable to ensure that the obligations of the DSA and those of the CSA Regulation are properly aligned for these very large players, who will be subject to both. In this perspective, it could be useful to add a provision in Chapter II, Section 1 of the CSA Regulation aimed at clarifying this alignment.
- Article 6 sets out obligations for the application stores. They must, where possible together with the application providers, assess whether a service can be used to solicit children online (the so-called "grooming" phenomenon). They must prevent minors from accessing the service if there is a significant risk, but the question of implementation arises. In addition, there are no specific measures to verify age or identity and, in the opposite direction, no measures to verify that an adult is not using children's applications for malicious purposes. However, locking the Regulation into overly precise technological processes, tied to current technological innovations, would not allow it to evolve at the pace of technology. It is therefore a question of expediency whether to be precise, which could quickly become outdated, or to require "reasonable measures" that would allow an independent authority to adapt to technology in order to prevent and fight the phenomenon of child solicitation.
- Detection order (Article 10(6))
First of all, we should mention a specific service in France, namely the CNAIP (national center for
the analysis of child pornography images), which centralizes data of a child pornography nature
(photos, videos) and contributes to the detection of illicit content.
- With regard to Article 10(6), the Commission's proposal states that:
"Where a provider detects potential online child sexual abuse through the measures taken to execute
the detection order, it shall inform the users concerned without undue delay, after Europol or the
national law enforcement authority of a Member State that received the report pursuant to Article
48 has confirmed that the information to the users would not interfere with activities for the
prevention, detection, investigation and prosecution of child sexual abuse offences."
However, Article 48 specifies that it is the competent authorities of the Member States - and not
Europol - that have to confirm that the information to the user would interfere with an investigation.
The Commission should therefore provide more information on this difference in wording, which
has serious practical consequences. In any case, the French authorities are
opposed to Europol being able to confirm whether informing a user would compromise an ongoing investigation. They
point out that Europol is an agency that supports Member States, that it acts only if two or more
Member States are affected and that it can in no way commit the competent national authorities.
Moreover, the French authorities welcome the mechanism for challenging an injunction, which
offers a highly important guarantee, particularly for users. Indeed, Article 10(5) provides that: "The
provider shall inform users in a clear, prominent and comprehensible way of the following: the
users' right of judicial redress referred to in Article 9(1) and their rights to submit complaints to the
provider through the mechanism
referred to in paragraph 4, point (d) and to the Coordinating
Authority in accordance with Article 34".
- Article 9(1) provides that "providers of hosting services and providers of interpersonal communications services that have received a detection order, as well as users affected by the measures taken to execute it, shall have a right to effective redress".
According to the French authorities, while the avenues of redress already existing in domestic law (Article 6-4 LCEN, the Law for Confidence in the Digital Economy) for removal orders do not pose particular problems, the right of redress as formulated in the context of the detection order does not seem to be sufficiently circumscribed and risks weakening the current redress processes. Indeed, each user will be able to exercise a right of appeal if he is affected by the measures taken to execute the detection order. The French authorities raise the question of what the concept of "affected" encompasses, which could, in their view, cover all users of a platform subject to a detection order. They point out that, given the number of users affected by a detection order, the judicial authorities risk becoming overloaded and unable to examine all of the challenges within a reasonable time, and thus de facto risk further undermining the right to a trial within a reasonable time (Article 6(1) ECHR).
During the "Police" group of June 22, 2022 (LEWP-P), the Commission indicated that the use of AI
would probably be classified as "high risk" in the sense of the proposed regulation on AI, insofar as
the latter detected images and conversations related to "grooming". This classification should not
constrain/limit investigations based on the use of AI systems for child sexual abuse or the detection
capabilities of online service providers.
Provider reporting obligations (Article 12)
The French authorities welcome this mechanism which allows providers to report to the Center all
content potentially related to sexual abuse. They also welcome the creation of a "flag" system for
content made available to users.
In this context, the French authorities raise the issue of double reporting that could occur for the same content. They point out that the Member States receive reports directly from public and private actors, in particular from NCMEC (the US National Center for Missing and Exploited Children). While the French authorities have taken note of the Commission's explanations on this point, they consider that it would be difficult in practice for the Centre to know whether NCMEC has actually transmitted an alert to the internal security forces. The French authorities are firmly opposed to the idea of a deconfliction solution whereby NCMEC would transmit its alerts exclusively to the Centre.
In connection with the preceding remarks, the French authorities raise the question of the time
frame within which the internal security forces will have access to the alert from the Centre. At
present, in France, user alerts provide instant information and enable rapid action to be taken.
However, the mechanism provided for in Article 12 involves intermediaries - the Centre - and
additional stages - the assessment of the basis for the alert - which risks lengthening the
transmission chain.
The French authorities note that the current Article 15a of the compromise text of the DSA provides
that when a hosting service provider becomes aware of information giving rise to a suspicion that a
criminal offence posing an imminent threat to the life or security of persons has been committed or
is about to be committed, it shall immediately inform the authorities responsible for the
investigation and prosecution of criminal offences in the Member States concerned. A similar
obligation is laid down in Article 14(5) of the TCO Regulation.
The French authorities believe that, at least in situations where CSA content clearly gives rise to a suspicion that a criminal offence posing an imminent threat to the life or safety of persons has been committed or is about to be committed – such as a child rape broadcast via live streaming – prior analysis by the Centre seems superfluous. Service providers should be able to automatically remove the content in question and notify the competent authorities in criminal matters.
Finally, the French authorities question the two deadlines proposed in the Regulation:
- On the reasons for the 3-month period within which the competent authorities will have to inform the providers, via the Centre, of their willingness to inform the user or not so as not to harm an ongoing investigation: the French authorities consider this period too short and propose replacing it with a more general formulation: "within a period set by the law enforcement authorities".
- On the relevance of an 18-month period that does not follow any objective criteria: the French delegation recalls that some users may be involved in long and complex investigations and that it would therefore be appropriate to extend this period.
Removal order (Article 14)
As a preliminary matter, the French authorities question the term "remove" and ask the Commission to clarify whether this term implies removal of the content from the servers or simply removal of the content from public view.
In addition, the French authorities question the need to call upon an independent judicial/administrative authority to issue a takedown order, at the risk of making the process considerably more cumbersome, when the draft text already provides for the intervention of another independent administrative authority (the "national coordination authority") in the process. They consider that an administrative authority should be able to issue removal orders for child pornography content. They recall that Article 8 of the DSA provides for the possibility for national judicial or administrative authorities to issue such orders.
In addition, following the example of what is practised for online terrorist content, and based on the model of the TCO Regulation, the French authorities consider that the competent national authorities should be able to issue, on their own initiative (and not necessarily on the proposal of the national coordinating authority), orders addressed to providers for the removal of content relating to online sexual abuse of children.
Moreover, the French authorities question the possibility for the judicial authority to set the duration of the period of non-information of the user, while the law enforcement authorities can set this period themselves for the other orders. The French authorities also note that this period of non-information is set by the judicial authority after a "simple consultation" of the "public authorities". They therefore question this "consultation", which is not binding on the judicial authority. Finally, on this point, the French authorities question the notion of public authority, which seems broader than the concept of "law enforcement authorities".
Furthermore, with regard to the duration of the removal, there is a different time limit between the TCO Regulation and the proposed CSA Regulation. If the Commission explains this by differences in the nature of the two regulations, would it not nevertheless be appropriate to align the procedures provided for in the CSA and the TCO as much as possible?
Blocking order (Articles 16 to 18)
The implementation of URL blocking requires decryption to access the URL. Beyond the impact of decryption on the level of security, administrations have, to date, neither the infrastructure nor the technical capacity to perform this decryption. The French authorities therefore recommend IP filtering, either directly or by sinkhole (an illustrative sketch of such filtering follows the list below).
In addition, the French authorities would like to assure the Commission of their correct interpretation of the concepts of "remove" and "disable access". The choice between "remove" and "disable access" is applicable only to "removal orders" under Article 14. This distinction could allow the host to choose between:
- removal of the content ("remove"), which then becomes inaccessible to all Internet users,
- or a more or less extensive limitation of access to it ("disable access"), which would allow, for example, a host to prohibit access for Internet users in the country that made the request, while allowing Internet users in the rest of the world to access it normally.
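As a purely illustrative sketch of the IP filtering recommended above, and not part of the French submission: with TLS in transit, an intermediary sees only the destination IP address (and, at most, the server name), never the full URL, so without decryption a blocking decision can realistically only key on network-level identifiers. The hypothetical Python sketch below, using only the standard library and made-up blocklist values, shows what such an IP-based filtering decision looks like.

```python
import ipaddress

# Hypothetical blocklist (illustrative documentation-range values only).
BLOCKED_NETWORKS = [
    ipaddress.ip_network("192.0.2.0/24"),      # stands in for a listed hosting range
    ipaddress.ip_network("198.51.100.17/32"),  # stands in for a single listed address
]

def should_block(dest_ip: str) -> bool:
    """Return True if the destination IP falls inside a blocked network.

    Because the URL path and query of HTTPS traffic are encrypted, this
    filter can only see the connection endpoint -- which is why URL-level
    blocking would require decryption, as noted above.
    """
    ip = ipaddress.ip_address(dest_ip)
    return any(ip in net for net in BLOCKED_NETWORKS)

# The filter sees only the endpoint, never the page requested.
assert should_block("192.0.2.45") is True
assert should_block("203.0.113.9") is False
```

A DNS sinkhole achieves a comparable effect one step earlier, by answering queries for listed domain names with the address of a warning page rather than that of the real host.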
With respect to Article 16, the French authorities propose adding the adjective "adapted" to the text to specify what is expected of providers: "The Coordinating Authority of establishment shall [...] under the jurisdiction of that Member State to take reasonable and adapted measures to prevent users from accessing known child sexual abuse [...]".
Points to be clarified by the European Commission
At this stage of the analysis, the French authorities have identified two points which, subject to
explanation by the Commission, could hinder the proper understanding of certain provisions.
Article 5(1) provides that: "Providers of hosting services and providers of interpersonal communications services shall transmit, by three months from the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following".
However, Article 3(4) refers to the costs incurred by the Centre in carrying out the analysis of the data samples requested by the providers and does not indicate any date as mentioned in Article 5(1). It would therefore be appropriate to amend the provision and replace the reference with the correct one (Article 3(6)).
Furthermore, Article 14(5), on the impossibility of carrying out the removal order, particularly in cases of force majeure, refers to the time limit laid down in paragraph 1 of the same Article. However, the French authorities note that paragraph 1 of Article 14 does not refer to any time limit; it is paragraph 2 which requires the provider to comply with the order within 24 hours.
GERMANY
General
Germany welcomes the opportunity to comment on the articles of the first two chapters.
Given that the Federal Government has not yet completed its examination of the Regulation,
we would like to enter a
scrutiny reservation.
Chapter I:
Regarding Article 1 (Subject matter and scope):
Paragraph 1 includes the term “uniform rules”.
Are we correct in assuming that the Commission intends to achieve a minimum degree of
harmonisation with its proposal?
We (explicitly) welcome that paragraph 2 focuses on services that are offered in the EU,
thus ensuring a level playing field between providers based in and outside the EU.
Paragraph 3 (b): We would like to ask the Commission once again to clarify the connection
between the Regulation and the Digital Services Act: can the Commission confirm that the
Regulation should take precedence over the Digital Services Act according to the principles
of lex specialis? If so, Germany believes that the wording of Article 1 paragraph 3 (b) (“This
Regulation shall not affect...”) should be revised.
As we understand it, the Commission wants the draft Regulation to serve as a legal basis for
providers of services for the processing of personal data for the execution of detection orders
under Article 6 GDPR and for the envisaged EU Centre as referred to in Regulation (EU)
2018/1725. We therefore ask the Commission to state this more precisely in the draft
version.
Article 2 (Definitions):
Regarding (a) and (b): According to the definitions of the Digital Services Act, “hosting
services” include “cloud computing, web hosting, paid referencing services or services
enabling sharing information and content online, including file storage and sharing”.
Interpersonal communications services also include number-independent voice calls, all
types of emails, messaging services and group chats. We currently believe that the
technological measures which the Commission’s proposal applies to the different types of
services encroach upon several fundamental rights, and we therefore expressly enter a
scrutiny reservation. Furthermore, we ask the Commission to clarify the proportionality of
the envisaged obligations regarding the different services (including cloud services).
It would be difficult to require providers of cloud services in the dark web to execute
detection orders. We therefore ask the Commission to explain how detection orders should
be enforced in the dark web.
To what extent does the Commission’s proposal take into account further alternative action
in the dark web?
Can private activities such as private hosting of email addresses fall within the scope of this
Regulation?
Does the Commission also intend to apply the obligation to providers of data/image hosting
services who only store such content to combat the distribution of CSAM? If so, which
measures does it intend to take?
Regarding (l), (o) and (q): To what extent does the Commission’s proposal take into account
differences in national criminal law? As we see it, such differences also arise with regard to
the definitions of Directive 2011/93/EU:
For example, in Germany, child grooming is punishable when it affects children (under
the age of 14) but not when it affects adolescents (between the ages of 14 and 18).
However, we understand the draft Regulation to mean that attempts to groom adolescents
also constitute solicitation of children and thus online child sexual abuse as defined in
paragraph (p).
The production and possession of juvenile pornographic material is also not punishable in
Germany if it was produced only for personal use and with the consent of the persons
depicted. However, we understand the Commission’s proposal to mean that youth
pornography also constitutes online CSAM as defined in paragraph (p) without
exception.
Chapter II
Section 1 “Risk assessment and mitigation obligations”:
In Germany’s view, imposing more obligations on providers of certain online services is the
right approach for fighting CSA. Requiring more preventive measures can significantly help
to make the online environment more child-friendly and to prevent CSA.
Germany believes that the mandatory risk assessment and risk-mitigation measures which
the proposal calls for can help to improve the targeted protection of children and young
people against harmful media, as long as private communication remains confidential, and
anonymous or pseudonymous use of online services remains possible. However, Germany
believes further specification is needed:
1. We believe binding parameters for risk assessment are necessary in order to significantly
increase consistency and legal certainty.
2. We also believe it is necessary to describe, using examples, which risk-management measures companies are to take.
On this point, we also ask the Commission to explain its idea of what constitutes lawful risk
management under the Regulation, possibly on the example of particular services. Germany
does not believe that it would be sufficient to issue specifications in the planned Guidelines
alone (see Article 3 (6) and/or Article 4 (5)).
Germany is in favour of stricter enforcement of age assurance and age verification measures
to mitigate risks and with regard to the obligations of providers of software application
stores, as long as the services in question can continue to be used anonymously and
pseudonymously. We therefore ask the Commission to describe its support for initiatives for
age assurance and verification measures which require a minimum of data (see the BIK+
strategy). What are the specific approaches in this area?
Section 2 “Detection obligations”:
Germany is in favour of uniform, Europe-wide obligations for certain providers to identify
known CSAM.
In view of the fundamental rights concerned and the possibility of false positive reports
(which cannot be ruled out no matter which technology is applied), we are currently
conducting an intensive review of the options for identifying new CSAM and grooming and
will have more to say about this at a later time.
We are also carefully reviewing the multi-step procedure proposed by the Commission for
issuing detection orders and will comment on it in greater detail later. To aid in visualising
the planned processes, we ask the Commission to provide a diagram illustrating the various
steps in the planned procedure for issuing detection orders.
Article 7 (4) states that “evidence of a significant risk” is required before a detection order
can be issued. Paragraphs (5), (6) and (7) provide further details as to what constitutes a
significant risk. Germany nonetheless believes that additional specification is needed to
define “significant risk” with legal certainty and ensure that the CSA Regulation can be
applied uniformly. Germany believes that such specification should be included in the text
of the Regulation itself, rather than only in the Guidelines.
According to the proposal, providers of hosting services of publicly disseminated
communication can be required to identify online CSA. It is both the responsibility and in
the interest of providers to keep their publicly accessible platforms from being used to
disseminate online CSA (see for example the statement by Meta at the CSA seminar in Paris
on 14–16 June).
Germany welcomes the Commission’s technology-neutral approach. Providers of
interpersonal communication services too are responsible for preventing the dissemination
of online CSA via their services; it should therefore be possible to require them to do so.
With regard to the technologies to be used, however, we still see an urgent need for
clarification, especially concerning the following points:
o The Regulation must not lead to general interception of private, encrypted
communication where there is no suspicion of wrongdoing.
o Germany is in favour of seamless, secure end-to-end encryption, which must not be undermined in either technical or legal terms. This is one objective of the
Coalition Agreement of Germany’s Federal Government, as is the fight against child
abuse.
With this in mind, Germany believes it is necessary to state in the draft proposal, for
example in Article 10 (3) (a) (new), that no technologies will be used which disrupt, weaken
or modify encryption. The Federal Government is still in the process of reviewing the use of
other technologies.
In view of the fundamental rights concerned, it is necessary in the interest of proportionality
to ensure that the technologies to be used are sufficiently sophisticated and fit for purpose,
with a minimal error rate.
GREECE
Introduction:
Initially, we would like to provide some general remarks, outlining our subsequent interventions:
The Greek competent authorities for the fight against online CSAM face the following primary operational difficulties: 1) encrypted communications obstruct the success of criminal investigations and seriously harm their effectiveness; 2) public Wi-Fi provides perpetrators with a safe internet connection, since users are not obliged to declare the MAC (media access control) address of their device in order to gain access; 3) the availability of NAT (Network Address Translation) and VPNs (Virtual Private Networks) creates significant difficulties for the detection of the IP address and the identification of the perpetrator; and 4) data retention periods vary even within the EU (e.g. one year in Greece and only a few days in another Member State). Additionally, the perpetrators of this type of crime have moved to the dark web or exploit selected encrypted messaging services whose operators refuse to provide the necessary data.
From the legal perspective, detecting, removing, and blocking CSAM in cyberspace constitutes an interference with the rights to private life, personal data protection, freedom of expression, and confidentiality of communications. Consequently, these actions must be subject to end-to-end safeguards, complying with the principles of necessity and proportionality at all stages of the process.
Regarding the technological domain, we have to pay attention to the current reliability and accuracy
of the tailored technologies. Our legislative efforts should be based on independent public
assessments and not only on outcomes derived exclusively from private companies.
To conclude, we propose examining the necessity of establishing the Centre at this stage, because the new Centre is referred to from the very first articles.
Article 2 (definitions):
We propose two modifications and one new suggestion concerning the definitions in Article 2.
Paragraph (m) stipulates that known CSAM means potential CSAM. We propose using the word unconfirmed or unverified instead of known, because the coexistence of the words known and potential complicates the meaning of this definition.
In the same spirit of legal clarity, on paragraph (n) we ask for the deletion of the word new and its replacement with the word suspicious.
Furthermore, we suggest inserting a new definition of indicators, aiming to underline their importance for detecting suspicious CSAM and, at the same time, to counter the perception of preventive mass surveillance of online activities, including interpersonal and encrypted communications.
Finally, we support the French proposal to harmonize the age of 18 on par. (i) and (j).
Article 4 (risk mitigation):
One question for the Commission concerning paragraph 4 and its last sentence: "That description shall not include information that may reduce the effectiveness of the mitigation measures." What is the rationale for this provision? Could the Commission mention particular examples?
Article 5 (risk reporting):
A question for the Commission: how is the consistent management of risk assessments by the various Coordinating Authorities ensured, avoiding divergent handling?
Article 7 (issuance of detection orders):
For this article, we declare a scrutiny reservation. In principle, we agree with the Commission's approach, which follows the relevant case-law of both the Court of Justice and the European Court of Human Rights. For instance, one of the prerequisites is that the order be issued by a judicial or an independent administrative authority. We come back to the structural matter of data retention: how does the Commission envisage the implementation of a detection order in a Member State whose data retention period is very limited?
HUNGARY
HU fully supports the objectives of the draft regulation; however, we have some general comments regarding its approach to certain important elements.
The proposed legislation appears to have a complex enforcement structure, with no clear or well-defined competences, even though it builds on the solutions used in the draft Digital Services Regulation (hereinafter "DSA") and in Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on combating the dissemination of terrorist content online (hereinafter "TCO"). According to the TCO Regulation, the coordinating authority and the judicial or independent administrative authority are one and the same, but they are separate authorities in the draft Regulation laying down rules for preventing and combating sexual abuse of minors (hereinafter "CSA"). A simpler solution would be for the competent authority to be able to issue blocking or removal orders itself, rather than having to go to a separate judicial or administrative authority. The burden on the coordinating authorities is heavy and duplication should be avoided; it would be difficult and costly to set up a national enforcement structure in line with this proposal.
The limitations of URL-based screening in the draft proposal could undermine the effectiveness of
the CSA Regulation and it would therefore be appropriate to include digital fingerprint-based
screening among the technical options.
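As a purely illustrative sketch of what digital fingerprint-based screening typically means, and not drawn from the proposal itself: uploaded files are matched against a list of fingerprints of previously verified material. The hypothetical Python sketch below uses an exact cryptographic hash for simplicity; deployed systems generally rely on perceptual hashes (for example, PhotoDNA-style) so that resized or re-encoded copies still match, which an exact hash does not.

```python
import hashlib

# Hypothetical indicator list: hex digests of previously verified material,
# as it might be distributed to providers by a central authority.
KNOWN_FINGERPRINTS = {
    # Illustrative entry: this is SHA-256("test").
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Exact fingerprint (SHA-256 hex digest) of an uploaded file.

    A production system would use a perceptual hash so that resized or
    re-encoded copies of the same image still match; SHA-256 only matches
    bit-identical files, which is the simplification made here.
    """
    return hashlib.sha256(data).hexdigest()

def is_known_material(upload: bytes) -> bool:
    """Screen an upload against the indicator list: a match flags it for review."""
    return fingerprint(upload) in KNOWN_FINGERPRINTS

# Example: the digest above is SHA-256(b"test"), so this upload is flagged.
assert is_known_material(b"test") is True
assert is_known_material(b"something else") is False
```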
In Hungary, the problem of end-to-end encryption, which makes it difficult to detect certain crimes
and to access and use electronic evidence in criminal proceedings, poses a significant challenge, and
it is therefore essential to create the technical conditions for law enforcement agencies to have
access to e-evidence, while ensuring appropriate safeguards. In order to act more effectively,
possible solutions in this area need to be explored and the Europol Innovation Lab will increasingly
provide priority support in this exploration.
We agree with the delegations that called for Europol to be involved in the negotiations on the draft
as soon as possible.
It is not clear from the proposal how the new institutional system will draw on the experience of INHOPE and the Member States' internet hotlines, and how it will incorporate them.
In Article 83(2)(a), the proposal provides for data collection based on "gender"; however, for
Hungary only the data collection on the basis of sex would be acceptable.
The role, competences and location of the EU Centre to be established should be discussed in depth.
Chapter I
We agree with the subject matter and scope set out in Article 1 of the draft, and with the reference in Article 1(4) to Chapter 2, Section 2 of the Regulation.
The definitions need to be reviewed. In Article 2 of the draft, we propose to include in point (j) of the definitions an age limit of 18 years or a reference to the age of consent of the Member States; 17 years is unacceptable in this form. We suggest changing it in coherence with the previous definition, or referring to the different interpretations within the Member States.
Chapter II
The title of the chapter does not reflect its content. Sections 2 and 4 already deal with the issuance of detection and removal decisions, which concern the role of the coordinating authority rather than that of the service provider. The wording of the regulation falls very far short of the requirement of clear, unambiguous and transparent regulation. It would be good if these powers could be merged or restructured.
In Article 3(4) ("Subsequently, the provider shall update the risk assessment where necessary and at least once every three years from the date at which it last carried out or updated the risk assessment"), the timeframe looks a bit too long; this assessment should be a living exercise.
The question of whether the condition in Article 7(4)(a) is fulfilled is partly a police matter, while all other tasks could be carried out by a designated authority, as in the TCO Regulation. The wording of Article 7(4) is incorrect, as it seems to completely exclude the discretion of the judicial authority or independent administrative body, whose decision becomes a formality if the conditions are met. If this is the aim, it seems more realistic to concentrate these powers in the hands of the judicial authority or the independent administrative body.
The language of the orders as defined in Sections 2 and 4 should be the official language of the issuer and English, not the language requested by the service provider, as translations may induce significant additional administrative burden and costs. We ask for a rule here fixing the languages as the official language of the Coordinating Authority and English.
Immediate fulfilment of the information obligation in Section 3, Article 12 ("without undue delay") may cause problems for law enforcement action and should, if possible, be suspended pending the reaction of the EU Centre.
The provisions on victim protection and support services and on informing victims, as set out in Articles 20 and 21, do not reflect the fact that the victims are necessarily children. There are no rules on representation, the situation and consequences of sexual exploitation of children within the family are not addressed, and no reference is made to the relevant EU rules in force. Since we are talking about child victims here, a very detailed explanation of the requirements and obstacles is needed. In particular, the proposed legislation does not cover rules on representation and on protection against criminal parents acting as legal representatives. In the first two paragraphs of Article 21 we should refer to the applicable EU legislation concerning victim protection and support, and we should channel these activities into the existing mechanisms in this field.
Article 22 requires service providers to keep relevant data. The proposal sets a general retention period of 12 months. However, the draft sets long procedural deadlines in a number of places and, although it is stated that derogations from this general deadline may be made to meet specific needs, it would be preferable to increase this general deadline significantly: the data should be kept until the relevant procedures end, and the deadlines mentioned elsewhere in the text are in any case much longer. We suggest opening the possibility of a five-year retention period in this proposal.
Chapter III
Our view is that the coordinating authority's remit should be reviewed. Hungary can cover these
competences, but not in one organisation. It would also be unwise to codify such a complex
organisation at the level of EU regulation, as this approach would generate conflicts of competence
and duplication. The tasks of the authorities and the police are mixed up and do not build on each
other in a logical way. We want to build on our existing capacities, with appropriate coordination.
Articles 26-30 of the draft expect an independent authority to act as coordinating authority, on whose initiative another independent authority will have to take a decision, which seems to be an unnecessary duplication. The competences of the coordinating authority include investigative, analytical and evaluative elements. These cannot be carried out by an independent administrative authority, and the police service should not be burdened with unnecessary coordination and administrative tasks. The possibility of designating other supporting competent authorities is only mentioned in the draft, with no further references to them, so it is not possible to define their role. The system of complex cooperation at national level should not be interfered with in such a deep way; we propose following the methodology of the TCO.
In Article 35, the level of fines does not converge with existing EU legislation, and we see no clear justification for this. We do not understand why this number was chosen; for the TCO it is 4%, as it is for the GDPR. Is this an area that requires more severe sanctions?
The titles of Articles 31 and 38 should be modified, their substantive consequences should be clarified, and the draft should not touch on criminal procedure issues. The monitoring activities in Article 31 are normally channelled into law enforcement tasks as well. Article 38 cannot be defined as an investigation from a criminal procedure point of view.
Article 36(1) rules that where the Commission has reason to suspect that a provider of relevant information society services has infringed this Regulation in a manner involving at least three Member States, it may recommend that the Coordinating Authority of establishment assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation. We would like to know what legal basis and what information would allow the Commission to come to such a conclusion, and where the basis for this is to be found in the draft.
Chapter IV
Article 42 designates The Hague in the Netherlands as the seat of the EU Centre. Several Member
States have objected to this. This solution seems logical in terms of efficient use of capacity and the
need for close cooperation with Europol, but it should still be a decision for the Member States. We
support liaison via liaison officers. We believe that more detailed rules are needed for the
relationship with Europol.
Chapter V
The data collection and transparency reporting provisions require further analysis, as they appear
overly detailed. Not just statistics but detailed activity reports are required from Member States. For
coordinating authorities, this level of data provision will be a significant burden.
As already mentioned in our general remarks, the second indent of Article 83(2)(a) foresees the
collection of data on the basis of "gender", which we do not accept. In line with the horizontal
Hungarian position, we reject the concept of gender; for us, the collection of data based on "sex" is
appropriate. Therefore, in Article 83(2)(a), "gender" should be replaced by "sex". For the Hungarian
side, we reject the concept of gender as such; in our view there is only sex. Furthermore, in practice
the authorities collect data only on the basis of sex, so the mandate cannot be fulfilled as drafted.
We try to be as constructive as possible during the negotiations and will provide our more detailed
position in the framework of the discussions within the LEWP.
IRELAND
Ireland is strongly in favour of the Regulation as a whole and is keen to ensure that the measures it
introduces are both effective and efficient.
Ireland repeats this general comment made at the Working Party, which broadly relates to a number
of articles: we have concerns in relation to the range and complexity of the responsibilities placed
on the national Coordinating Authorities and we continue to scrutinise all references to national
authorities. In order to assist Member States’ understanding on this aspect of the proposal, we
repeat our suggestion, made in earlier written comments, for flow charts setting out the
Commission’s understanding of how the national coordinating authorities will interact with each
other and other bodies. It might also be helpful if the Commission could enumerate all the tasks it
foresees the Coordinating Authorities undertaking.
We have similar concerns around efficiency and complexity in relation to the responsibilities given
to the “judicial authority or independent administrative authority”. It is our understanding that the
Regulation requires that Member States make provision for the role of this second national
competent authority in addition to the Coordinating Authority. We note that the Presidency paper
accompanying the upcoming discussion at the informal COSI states “Member States may appoint
one or more national competent authorities”. Does this mean one or more, in addition to the
“judicial authority or independent administrative authority”?
In Ireland the Courts are our "judicial authorities" – is it intended that we should go to the Courts
for approval for the issuance of every detection, removal or blocking order? Alternatively, if we go
down the path of an "independent administrative authority", this raises the question of why we are
creating two separate new independent national authorities to deal with the same matters. Although
we are very aware of the need for safeguards and accountability, we have reservations about the
level of complexity involved.
We also have some comments that were not made at the Working Party. Again, we are supportive
of the principles underlying the process laid out in the Regulation whereby detection order follows
risk mitigation follows risk assessment, but we are trying to understand the practical implications.
One issue that has been raised with us by a prominent online service provider (and no doubt raised
also with the Presidency and Commission) is that the Regulation will stop companies from
continuing to use techniques which prevent harm from happening online in the first place. The
company claims that the proposal does not provide a legal basis for companies to process
communications metadata to tackle child sexual abuse in the absence of receiving a detection order
from a member state authority. There are prevention techniques which are currently deployed
which would no longer be allowed under these proposals.
Ireland regards prevention as very important and would be concerned that there could be a lengthy
period in which the legal basis for voluntary detection had been removed but no DO had yet been
issued. The risk is greater when we consider that we cannot know how long it will take for the first
DOs to be issued, or even be sure that they will be issued at all. By this we mean that we are
creating a process in which Coordinating Authorities, which are required to be completely
independent, and independent judicial or administrative authorities all have to decide that a DO is
justified, and any challenges to these decisions must be overcome. So in addition to taking some
time, the outcome of the process cannot be certain. Is there any way of introducing the process
envisaged by the Regulation while also ensuring that the preventative measures currently being
employed, which have been shown to be effective, can continue?
Ireland expects to have further comments to make in relation to Article 7 specifically, and the ways
in which national authorities, other bodies and the EU Centre interact in general. We will share
these in due course.
From a drafting point of view we would point out the following errors:
- Article 4(2)(d) has a reference to Article 3(4) that should read 3(6).
- The Article 5(1) chapeau has the same mistake.
- Article 5(1)(a) refers to 3(5) when it should read 3(7).
ITALY
With regard to the discussion on 5 July on the CSA Proposal, we would like to recall our previous
comments on Articles 1 to 7.
We really appreciated the CION replies and the opportunity granted to share a workflow scheme to
better understand the roles, powers and prerogatives of the different actors involved in the
Regulation (PP, CA, EU Centre). This would be disseminated at national level in order to facilitate
a deeper evaluation of the impact on the national legal and operational framework.
Please consider also that, since the national assessment of the proposal is still pending, we have a
general scrutiny reservation on the text.
LATVIA
GENERAL PRELIMINARY COMMENTS
LV agrees that work on prevention and combating of child sexual abuse (CSA) has to be
intensified.
LV also agrees that voluntary measures by providers to detect and report CSA have proven
insufficient.
LV continues assessing the proposed CSA Regulation. Thus, LV maintains a general scrutiny
reservation.
DETAILED PRELIMINARY COMMENTS
Article 3 “Risk assessment”
LV finds it important that almost all hosting service providers and interpersonal communications
service providers have at least 6 months to carry out the first risk assessment. LV understands that,
in accordance with Article 3(4), the first risk assessment has to be carried out by 3 months after the
date of application of the proposed CSA Regulation (it shall apply from six months after its entry
into force, which takes place on the twentieth day following that of its publication in the Official
Journal of the European Union). Thus, in practice, the relevant providers already offering their
services in the Union will have 9 months from the entry into force of the proposed CSA Regulation
to prepare the first risk assessment, which, in LV's view, is sufficient. In view of this, LV considers
that at least 6 months should also be granted to those providers that did not offer the service in the
Union by the date of application of the proposed CSA Regulation (currently 3 months).
Article 5 “Risk reporting”
LV believes that the second sentence of Article 5(3) should refer to Article 5(2), not to the first
subparagraph (LV considers that this provision refers to the suspension of the 3 months' period
related to the assessment and determination of the Coordinating Authority of establishment referred
to in paragraph 2 of this Article).
Chapter II “Obligations of providers of relevant information society services to prevent and
combat online child sexual abuse”
General comment: LV would like to clarify whether, after the entry into application of the proposed
CSA Regulation (when the Interim Regulation (Regulation (EU) 2021/1232) ceases to apply),
relevant service providers will be able to continue the voluntary detection of CSA on the basis of
the proposed CSA Regulation (as COM previously pointed out, the issuance of the first detection
order could take approximately 1 year). If the answer is affirmative, LV would like to draw
attention to the fact that in such a case certain service providers (who will not be issued a detection
order, but who will nevertheless continue voluntary detection of CSA) will continue making their
own decisions regarding fundamental rights, and there will not be harmonised guarantees (so far,
COM has mentioned that one of the aims of mandatory detection of CSA by relevant providers was
to eliminate such situations).
Article 7 “Issuance of detection orders”
LV notes that in accordance with Article 3(4)(a) a hosting service provider or interpersonal
communications service provider whose service is subject to a detection order issued in accordance
with Article 7 has to update the risk assessment at the latest two months before the expiry of the
period of application of the detection order. In view of this, LV considers that Article 9(3) should
set not only a maximum period of application of a detection order, but also an adequate minimum
one (for example, 6 months).
LV would like to understand whether the application period of an issued detection order can be
extended, as well as the procedure for the issuance of a new detection order, namely whether in
practice there can be a situation where a hosting service provider or an interpersonal
communications service provider is not required to carry out mandatory detection of CSA in a
particular service for a certain period of time despite a high risk of dissemination of CSA there.
LITHUANIA
Lithuania strongly supports the Commission's new initiative for a Regulation on combating child
sexual abuse. The unprecedented growth in child sexual abuse on the Internet all over the world
calls on EU Member States to be united in tackling it. We would like to highlight that it is
appropriate to assess the proposed Regulation not only in the context of proportionality with human
rights but also in the context of personal data protection. We support measures that clearly describe
the obligations of digital service providers to respond to, assess and immediately remove illegal
content online.
However, we have reservations about the establishment of a separate EU Centre. We do understand
that the envisaged functions of the Centre are crucial and necessary in addressing child sexual
exploitation, but the nature of these functions is specific and covers a rather narrow field.
Additionally, it is questionable whether the establishment of the above-mentioned Centre would not
cause delays in the exchange of information with law enforcement and in the deletion of illegal
content online, as it will be an extra link in the whole process.
Lastly, we would like to enter a scrutiny reservation on the proposal as a whole, as internal
discussions with the relevant partners in the capital have only just started and, given the complexity
and volume of the above-mentioned Regulation, we need more time to examine the details and
respond to this proposal accordingly.
MALTA
General considerations
In principle, the Maltese government supports this proposal. At this stage, Malta joins other
Member States in entering a general scrutiny reservation. It is important to set out clear aims and
objectives and how these are going to be implemented by both the private sector and public
authorities. To this end, the legislative proposal should not present a complex approach which
would decrease the effectiveness of its aims and objectives.
Malta welcomes the references affording hotlines used to report online child sexual abuse the
necessary recognition in this legislative proposal. In the current text, however, the involvement of
hotline organisations in handling child sexual abuse material and notice-and-takedown is not
adequately reflected in the recitals and operative text. It is therefore imperative to articulate this
involvement better so that such hotline organisations can continue receiving reports and issuing
takedown notices.
- Article 1
The wording used is reflective of the balance that needs to be found between preventing and
combatting child sexual abuse while safeguarding the rights and interests of users of the targeted
information society services, in particular to protect the integrity and importance of end-to-end
encryption. To this end, Malta looks forward to the opinion of the European Data Protection
Supervisor on this legislative proposal.
Another important point is that, because this lex specialis is far-reaching, the specific nature of the
judicial and administrative organs and their cooperation with the proposed coordinating authority
needs to be clear.
In terms of paragraph 2 of Article 1, should this be understood as the scope of the proposed
Regulation applying both to intra-EU cross-border information society services and to those outside
the EU which do not have a main establishment? This may require clarification.
In addition, the legislative proposal is similar to Regulation (EU) 2021/784 in some aspects. Again,
with reference to paragraph 2 of Article 1, should it be therefore understood that in the intra-EU
case, the competent authority of one Member State can issue detection/removal/blocking orders to a
relevant information society service established in another Member State or are these orders to be
issued to a provider of services by the Coordinating Authority under which that relevant
information society service is established? The latter seems to be the case on reading the respective
articles on the issuance of the orders. Therefore, clarification on this may be required.
- Article 2
The inclusion of two definitions in para (i) and (j) for ‘child’ and ‘child user’ respectively suggests
that this twofold approach is required to address child sexual abuse material and solicitation. Some
Member States have asked for this to be removed and to retain one definition with a general age of
18 years.
While Malta is preliminarily in favour of this, we wish to have further information on whether it
has been included for the following reason: the definition of 'online child sexual abuse' includes
both the online dissemination of child sexual abuse material and the solicitation of children;
therefore, the definition of 'child' is used to cover persons under the age of 18 years who are the
victims depicted in child sexual abuse material, whereas the definition of 'child user' is used to
cover persons under the age of 17 years who are susceptible and/or vulnerable to solicitation
leading to child sexual abuse in online and offline sexual activities. This distinction is made to
oblige relevant information society services not to allow 'child users' to download high-risk
software applications (as per Article 6 and the example used by the Commission in the LEWP
meeting of 5 July 2022). Can you kindly confirm this? Did the age of sexual consent have any
bearing on the decision to have two definitions? An opinion from the Council Legal Service on
aligning these two definitions would be welcome; to this end, Malta supports other Member States
in this request.
On the lack of definitions regarding the ‘competent judicial authority’ or ‘independent
administrative authority’, Malta would be open to examples of authorities which the Commission
would envisage being given the role.
- Article 3
With reference to the risk assessment, Malta adheres to the first principle of this legislative
proposal, that is, prevention, but this should not result in overburdening the operations of relevant
information society services. It should be clearer when these kinds of assessments are to be carried
out and under which circumstances. On paragraph 4 of Article 3, this risk assessment should indeed
be a continuous process with clear binding rules. Malta joins other Member States in requesting an
illustrative presentation of how the risk assessment would work and of any measures for non-
compliance with the requirements of this assessment.
Article 4
The removal of child sexual abuse material is more effective when 'trusted flaggers', as specialised
entities with specific expertise, collaborate with online platforms and law enforcement authorities.
Using this expertise can result in higher-quality notice and takedown. It would therefore be
beneficial to strengthen the wording on their inclusion in paragraph 1 of Article 4, possibly by
omitting the provider's option to select which measures are applied, rather than relying on the
providers to decide.
Article 7
Malta joins other Member States in requesting an illustrative example of the issuance of the orders.
For now, the process is viewed as complex. Malta supports the concerns raised by other Member
States. The traditional roles of judicial and law enforcement authorities are not clear: how will the
law enforcement authorities operate in terms of this legislative proposal? The provisions indicate
that the coordinating authority will collect the evidence and make the case for the orders, with other
authorities then deciding whether to take it forward. What happens if the online child sexual abuse
is first presented to the law enforcement authority? Is it the case that it will feed this information to
the coordinating authority and that its responsibility stops there?
THE NETHERLANDS
The Netherlands acknowledges the problem of Child Sexual Abuse Material (CSAM) and the
urgency to fight this horrible crime. In recent years, the Netherlands has made great efforts to
reduce the amount of CSAM on Dutch networks. The Netherlands is a big proponent of a joint
European approach to combat child sexual abuse material, given the fact that the Internet crosses
national boundaries. We are therefore pleased that the European Commission has published a
proposal to make the fight against child sexual abuse more effective in Europe. We applaud the
efforts of the Commission to strengthen the European fight against CSAM and we welcome the
proposal, although we also have various questions and concerns. The Netherlands appreciates the
possibility to ask questions about the proposal and looks forward to the Commission’s responses.
The Netherlands would appreciate it if the Commission could clarify some questions about
Articles 1 to 7.
Article 1(c): Article 1(c) only mentions hosting providers. Can the Commission clarify why the
mandatory removal or disabling of access to CSAM does not apply to interpersonal communication
services?
Article 2(j):
Regarding the definition of 'child user' in Article 2(j), we want to ask why a child (point (i)) is
defined as someone under 18 and a child user as someone under 17. It might make more sense to
define 'child user' as "a child who uses a relevant information society service", since 'child' has
already been defined.
Articles 3, 4 & 5: Why, as opposed to the Terrorist Content Online Regulation (TCO), has it been
decided that all HSPs and ICSs in general must carry out a mandatory risk assessment (Article 3),
take mitigating measures (Article 4) and be subject to a reporting requirement (Article 5)? We can
imagine that this stems from the desire to reduce the amount of CSAM as much as possible.
The question is whether these requirements are still proportionate in relation to the goal. In other
words, how many providers are affected by these measures, and can the Commission clarify why it
considers a general obligation necessary to reduce CSAM?
Article 5:
Concerning the coordinating authority, we wonder what the relationship is between the coordinator
mentioned in the Digital Services Act (DSA), the authority mentioned in the TCO Regulation and
this coordinating authority.
Article 6: Article 6 requires providers of software applications to consider whether their service
poses a risk of abuse for grooming purposes. This requires some clarification. The Netherlands
wonders at what level of risk measures are justified and what kind of measures are envisaged.
Article 7:
a) Firstly, we wonder what the implications of a detection order addressed to Interpersonal
Communication Services are for encryption. Can such an order be fulfilled without breaking
(end-to-end) encryption? Furthermore, we are curious how the Commission intends to
determine whether grooming is taking place. Also, how is the right to respect for private life
(and for communication/correspondence), as laid down in Article 7 of the Charter and
Article 8 of the European Convention for the Protection of Human Rights and Fundamental
Freedoms, protected? How is it ensured that an intrusion into someone's personal life meets
the guarantees mentioned in those same articles (necessity, proportionality, subsidiarity)?
We can imagine that it is more difficult to establish whether grooming is taking place; it is
therefore likely that a greater infringement on personal life will be necessary than in the
case of CSAM.
b) How does the obligation to detect under the detection order relate to Article 15 of the
Electronic Commerce Directive (Directive 2000/31/EC) and Article 7 of the future Digital
Services Act (DSA), respectively, which state that Member States may not impose a general
obligation on service providers to monitor the information they transmit or store, or to
actively seek out facts or circumstances indicating illegal activity? Are these provisions
compliant with the Telecom Code1 and ePrivacy directive?
c) Another question regarding the detection order concerns the specific moment at which
hosting providers are required to detect CSAM and grooming. Is the scope of the detection
order limited to published content, or are hosting providers also obliged to detect material
before it is published?
Article 7(1):
The TCO Regulation explicitly states in Article 5(8) that the obligation to take specific measures
for hosting service providers does not include a general obligation to monitor the information they
transmit or store, nor a general obligation to actively seek facts or circumstances indicating illegal
activity. In addition, the obligation to take specific measures under the TCO Regulation does not
include an obligation for the hosting service provider to use automated tools. By contrast, the
CSAM Regulation provides for measures to be taken as a result of a detection order under Article
7(1) in conjunction with Article 10(1) and Article 46 of the Regulation. Why has the Commission
chosen these different approaches?
Article 7(1): The Netherlands wonders why the Commission chose this specific structure, in which
the coordinating authority asks another judicial or administrative authority to issue a detection
order. Why does the coordinating authority not issue it itself, in line with the TCO Regulation?
1 Telecom Code: Directive (EU) 2018/1972; ePrivacy: Directive 2002/58/EC
Article 7(3)(c): The implementation plan should be accompanied by the opinion of the Data
Protection Authority. What would be the nature of the Data Protection Authority's assessment?
Article 7(7): It is conceivable that some ICSs are by their very nature an easy tool for grooming,
since their main service is providing communication between persons. Can the Commission reflect
on the scenario in which an ICS has done everything in its power to prevent its service from being
abused, but grooming still occurs through the service?
Article 7(9)
The period during which a detection order may apply runs from three months up to twelve months;
in the case of known or new child sexual abuse material it may even run up to 24 months.
Considering the impact of the (execution of the) detection order on the fundamental rights of users,
this seems quite an extensive period. How did the Commission establish that these minimum
periods of three months and maximum periods of either 12 or 24 months would be suitable and
necessary for providers to take the measures needed to prepare and execute detection orders?
Furthermore, is the number of users affected by a detection order a relevant parameter that must be
taken into account when issuing one?
PORTUGAL
Following the request for comments by 8.7.2022 (Articles 1 to 7), the Portuguese delegation recalls
that it has submitted a scrutiny reservation.
PT wishes, nevertheless, to contribute to the discussion with the following observations:
It should be made more explicit which type of European funding is referred to on page 3 of the
explanatory memorandum, bearing in mind that national bodies, namely the police, have already
made several investments.
Article 1(1): the harmonization and the reference to the internal market seem excessive, especially
since there are rules that do not contribute to the harmonization of decisions, for example Article
35 on penalties, which does not really promote the internal market.
It would also be suitable to insert in this article the obligations imposed on software application
stores resulting from Article 6 and, more clearly, the obligations of each of the entities provided for
in Article 2(f), as follows:
1. This Regulation lays down uniform rules to address the misuse of relevant information
society services for online child sexual abuse.
2. It establishes, in particular, that:
(a) all providers of relevant information society services are obliged to minimise the risk
that their services are misused for online child sexual abuse;
(b) providers of hosting services are obliged to detect, report, remove or
disable access to online child sexual abuse;
(c) providers of interpersonal communication services are obliged to detect and report
sexual abuse material on their services;
(d) software application stores are obliged to assess whether any application that they
intermediate is at risk of being used for the purpose of solicitation and, if this is the
case and the risk is significant, to take reasonable measures to identify child users
and prevent them from accessing it;
(e) providers of internet access services are obliged to disable access to child sexual
abuse material;
(f) rules on the implementation and enforcement of this Regulation, including as
regards the designation and functioning of the competent authorities of the Member
States, the EU Centre on Child Sexual Abuse established in Article 40 ('EU
Centre') and cooperation and transparency.
As to the second part of Article 1(2), we suggest considering the main place of establishment.
However, we have serious doubts regarding the characterisation of the "providers of relevant
information society services" under the terms of Article 2(f), due to subparagraph (w).
Article 2 – Regarding subparagraphs (m) and (n), it should be stressed that it is not necessary to
use the term "potential", as this qualification can be misleading.
With regard to subparagraph (p), it should be noted that the proposed definition of "online child
sexual abuse" as "online dissemination of child sexual abuse material" does not correspond to the
concept contained in Regulation (EU) 2021/1232² with regard to the use of technologies by
providers of number-independent interpersonal communications services to process personal and
other data for the purpose of combating child sexual abuse online (Article 2(4) of that Regulation),
which does not include the term "dissemination".
In addition, the Child Sexual Abuse Directive uses the expressions "distribution, dissemination or
transmission".
This is of particular concern as it may have implications for the scope of the proposal. It is
questionable whether simple uploading, understood as the action of sending data from a local
computer to a remote server, would be covered by the notion of dissemination.
As regards subparagraph (w), PT considers that the understanding of the European Data Protection
Board, available at
https://edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_3_2018_territorial_scope_pt.pdf,
is still valuable: in fact, the proposal adopts a "formalistic approach according to which companies
are only established where they are registered". Yet it would be important to take into account the
level of stability and the specific nature of the activities in the Member State.
Therefore, we point out that greater proximity to Article 4(16) of the GDPR would be desirable.
Subparagraph (l): PT believes that there are areas where it is difficult to interpret what effectively
constitutes material to be detected. See in particular Article 5(7) of the Directive on the sexual
abuse of minors and also the conditions under which paragraph 8 of the same article applies.
Article 6 - PT recalls the question raised at the last meeting concerning the use by adults of
applications intended for children (adults impersonating children), which should be looked into.
Article 7 - PT notes that the structure of the document is conducive to some confusion, since
conceptually it would be clearer not to spread the competences of the national coordinating entities
over several chapters.
2 REGULATION (EU) 2021/1232 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL of 14
July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC
SPAIN
Spain supports all measures to strengthen the detection and surveillance of child pornography and
other sexual abuse of minors on the Internet and the idea of encouraging the cooperation of
companies that offer services on the web in order to develop prevention strategies. However, this
legislative development is very complicated and involves several actors, which is why Spain has a
scrutiny reservation on this issue. Having said that, Spain has a general comment to share:
General comment on the scope of protection: Sexual Abuse against Children and Vulnerable
People: The United Nations Convention on the Rights of Persons with Disabilities (UN-CRPD)
states in Article 16(1) that "States Parties shall take all appropriate (…) measures to protect
persons with disabilities (…) from all forms of exploitation, violence and abuse, including their
gender-based aspects”. Individuals with intellectual disability (ID) are more likely to experience
sexual abuse and less likely to report it. Consent is crucial when anyone engages in sexual activity,
but it plays an even greater, and potentially more complicated, role when someone has a disability.
Some disabilities can make it difficult to communicate consent to engage in sexual activity, and
perpetrators may take advantage of this. Persons with disabilities may also not receive the same
education about sexuality and consent that persons without disabilities receive. In addition, a person
with an intellectual or developmental disability may not have the capacity to consent to sexual
activity as defined by state law. All of these factors make this group more vulnerable to sexual
abuse online, which is why Spain believes that the scope of protection in this regulation should be
extended for vulnerable persons too.