• In this regard, the EDPS points out, as an overarching principle to be complied with, that any measure restricting fundamental rights and freedoms should be necessary and proportionate16, which implies that such measures should be as targeted as possible.
• In accordance with this principle, the EDPS recommends introducing in the Proposal an obligation for HSPs, before they put in place any proactive measure, to:
(i) Perform and make public a risk assessment on the level of exposure to terrorist content (also based on the number of removal orders and referrals received);
(ii) Draw up a remedial action plan to tackle terrorist content, proportionate to the level of risk identified17.
The aforesaid assessment and action plan would also serve as useful accountability tools for a periodic review of the measures.
As a further accountability tool, HSPs should report periodically on the actions taken and on the residual level of threat (exposure to terrorist content).
3.3.2. Use of automated tools in the context of proactive measures and safeguards regarding the use of such measures
• Recitals 16 and 18 and Article 6(2) specifically provide that proactive measures may include the use of automated tools. The EDPS stresses that such automated tools should only be used in a cautious and targeted way, on the basis of the outcome of the risk assessment referred to in Section 3.3.1 of these formal comments.
• The EDPS stresses that the procedures envisaged by the Proposal lead, in some if not most cases, to the identification of the user who has uploaded the terrorist content (this is the case for the preservation of data related to removed content, to be stored by HSPs under Article 7 and possibly accessed by law enforcement authorities; for complaints lodged by the user under Article 10; and for the provision of information about the removal by the HSP to the user).
• In this respect, the EDPS also draws attention to the fact that it cannot be excluded that the proactive measures taken by HSPs, including automated tools, for the recognition and removal of content uploaded by users may also be considered 'automated decision-making, including profiling' within the meaning of Article 22 of the GDPR.
• The EDPS recalls that Article 22(1) of the GDPR provides a general prohibition of solely automated individual decision-making which produces legal effects or similarly significant effects on data subjects. However, Article 22(2) of the GDPR foresees exceptions to this general prohibition and sets out specific cases and requirements under which such decision-making is permissible. In particular, Article 22(2)(b) of the GDPR provides that Union or Member State law can authorise such decision-making where it also lays down "suitable measures" to safeguard the data subject's rights and freedoms and legitimate interests. In this respect, Recital 71 of the GDPR stresses that such "suitable safeguards" should include in any case specific information to the data subject, the right to obtain human intervention, in
16 Article 52(1) of the Charter states that: "[a]ny limitation on the exercise of the rights and freedoms recognised by this Charter must be provided for by law and respect the essence of those rights and freedoms. Subject to the principle of proportionality, limitations may be made only if they are necessary and genuinely meet objectives of general interest recognised by the Union or the need to protect the rights and freedoms of others."
17 The Impact Assessment refers to the "risk assessment" and "remedial action plan" in the context of the implementation of measures under Article 6 pursuant to a risk-based approach. However, such requirements were ultimately not included in the Proposal.
4. Mandatory preservation of content and related data by the HSPs
• Pursuant to Article 7, HSPs would be required to preserve terrorist content (removed or disabled as a result of any of the three possible sets of actions under the Proposal, i.e., the execution of removal orders, referrals or proactive measures) and related data20 for the purpose of subsequent administrative proceedings and judicial review (as a safeguard in cases of erroneous removal), as well as for the purpose of prevention, detection, investigation or prosecution of terrorist offences21.
• The EDPS notes that the imposition of such a data retention obligation on HSPs entails that private entities are required to retain data (including personal data relating to the uploaders and concerning offences of a criminal law nature, namely 'terrorist offences') for law enforcement purposes for a period of six months.22 In this respect, the EDPS recalls that, pursuant to Article 10 of the GDPR, the processing of personal data relating to criminal offences should be carried out only under the control of official authority or when the processing is authorised by Union or Member State law providing for appropriate safeguards for the rights and freedoms of data subjects.
• Since the processing in question (preservation of terrorist content and related data) would not be under the control of official authority, the appropriate level of safeguards to be ensured is a key issue. The EDPS observes that Article 7(3) provides that HSPs should "ensure that the terrorist content and related data [...] are subject to appropriate technical and organisational safeguards" and that these "technical and organisational safeguards shall ensure that the preserved terrorist content and related data is only accessed and processed for the [relevant] purposes [...] and ensure a high level of security of the personal data concerned."
• The EDPS recalls that Article 7 of the repealed Directive 2006/24 (hereinafter, 'the Data Retention Directive')23 provided, with wording similar to that of the Proposal, that: "the data shall be subject to appropriate technical and organisational measures to protect the data against accidental or unlawful destruction, accidental loss or alteration, or unauthorised or unlawful storage, processing, access or disclosure" and "the data shall be subject to appropriate technical and organisational measures to ensure that they can be accessed by specially authorised personnel only". However, the CJEU concluded, in Digital Rights Ireland, that the Data Retention Directive did not provide sufficient safeguards to ensure effective protection of the retained data against the risk of abuse, unlawful access and subsequent use of the data.24
• The EDPS observes that it can be argued that the Proposal, similarly to the Data Retention Directive, does not lay down substantive and procedural conditions relating to the access and the subsequent use of the preserved data by "competent authorities", as required by the
20 On the need to define 'related data', see the considerations made under Section 2.2 of these formal comments.
21 See Recital 21.
22 In particular, Recital 22 provides: "To ensure proportionality, the period of preservation should be limited to six months to allow the content providers sufficient time to initiate the review process and to enable law enforcement access to relevant data for the investigation and prosecution of terrorist offences. However, this period may be prolonged for the period that is necessary in case the review proceedings are initiated but not finalised within the six months period upon request by the authority carrying out the review. This duration should be sufficient to allow law enforcement authorities to preserve the necessary evidence in relation to investigations, while ensuring the balance with the fundamental rights concerned." (Emphasis added)
23 Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC, OJ L 105, 13.4.2006, p. 54-63.
24 Joined Cases C-293/12 and C-594/12, Digital Rights Ireland Ltd, see in particular paras 54-55 and 65-67.