

Brussels, 25 January 2022
WK 868/2022 INIT
LIMITE
TELECOM

WORKING PAPER
This is a paper intended for a specific community of recipients. Handling and further distribution are under the sole responsibility of community members.
CONTRIBUTION
From: General Secretariat of the Council
To: Working Party on Telecommunications and Information Society
Subject: Artificial Intelligence Act - NL comments (doc. 14278/21)

Delegations will find in annex the NL comments on the Artificial Intelligence Act (doc. 14278/21).

 
 
Deadline for comments: 6 January 2022
Presidency compromise text for Artificial Intelligence Act (doc. 14278/21)
Comments and drafting suggestions requested on Articles 30-85, Annexes V-IX

Important: In order to guarantee that your comments appear accurately, please do not modify the table format by adding/removing/adjusting/merging/splitting cells and rows. This would hinder the consolidation of your comments. When adding new provisions, please use the free rows provided for this purpose between the provisions. You can add multiple provisions in one row, if necessary, but do not add or remove rows. For drafting suggestions (2nd column), please copy the relevant sentence or sentences from a given paragraph or point into the second column and add or remove text. Please do not use track changes, but highlight your additions in yellow or use strikethrough to indicate deletions. You do not need to copy entire paragraphs or points to indicate your changes; copying and modifying the relevant sentences is sufficient. For comments on specific provisions, please insert your remarks in the 3rd column in the relevant row. If you wish to make general comments on the entire proposal, please do so in the row containing the title of the proposal (in the 3rd column).

 
 
 
Presidency compromise text 
Drafting Suggestions 
Comments 
Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL LAYING DOWN HARMONISED RULES ON ARTIFICIAL INTELLIGENCE (ARTIFICIAL INTELLIGENCE ACT) AND AMENDING CERTAIN UNION LEGISLATIVE ACTS
(Text with EEA relevance)
 
 
 
CHAPTER 4

NOTIFYING AUTHORITIES AND NOTIFIED BODIES
 
 
 

 

Article 30 
 
 
Notifying authorities 
 
 
 
1. Each Member State shall designate or establish a notifying authority responsible for setting up and carrying out the necessary procedures for the assessment, designation and notification of conformity assessment bodies and for their monitoring.
 
 
 
2. Member States may designate a national accreditation body referred to in Regulation (EC) No 765/2008 as a notifying authority.
 
 
 
3. Notifying authorities shall be established, organised and operated in such a way that no conflict of interest arises with conformity assessment bodies and the objectivity and impartiality of their activities are safeguarded.
 
 
 

4. Notifying authorities shall be organised in such a way that decisions relating to the notification of conformity assessment bodies are taken by competent persons different from those who carried out the assessment of those bodies.
 
 
 
5. Notifying authorities shall not offer or provide any activities that conformity assessment bodies perform or any consultancy services on a commercial or competitive basis.
 
 
 
6. Notifying authorities shall safeguard the confidentiality of the information they obtain.
Drafting suggestion: Insofar as the information obtained by the notifying authorities must be considered confidential, the notifying authorities shall safeguard the confidentiality of this information, except when disclosure is required by Union or national law.
Comment: We propose to first test whether the information obtained is indeed confidential. If not, the information can be shared freely. If the answer is yes, confidentiality must be safeguarded, except when disclosure is required by Union or national law.
 
 
 
7. Notifying authorities shall have a sufficient number of competent personnel at their disposal for the proper performance of their tasks.
Drafting suggestion: Member States shall ensure that the notifying authorities have a sufficient number of competent personnel at their disposal for the proper performance of their tasks.
Comment: More firm commitments necessary. The word ‘proper’ is too vague.
 
 
 
8. Notifying authorities shall make sure that conformity assessments are carried out in a proportionate manner, avoiding unnecessary burdens for providers and that notified bodies perform their activities taking due account of the size of an undertaking, the sector in which it operates, its structure and the degree of complexity of the AI system in question.
 
 
 
Article 31 
 
 
Application of a conformity assessment body for notification
 
 
 
1. Conformity assessment bodies shall submit an application for notification to the notifying authority of the Member State in which they are established.
 
 
 

2. The application for notification shall be accompanied by a description of the conformity assessment activities, the conformity assessment module or modules and the artificial intelligence technologies for which the conformity assessment body claims to be competent, as well as by an accreditation certificate, where one exists, issued by a national accreditation body attesting that the conformity assessment body fulfils the requirements laid down in Article 33. Any valid document related to existing designations of the applicant notified body under any other Union harmonisation legislation shall be added.
 
 
 
3. Where the conformity assessment body concerned cannot provide an accreditation certificate, it shall provide the notifying authority with the documentary evidence necessary for the verification, recognition and regular monitoring of its compliance with the requirements laid down in Article 33. For notified bodies which are designated under any other Union harmonisation legislation, all documents and certificates linked to those designations may be used to support their designation procedure under this Regulation, as appropriate.
Drafting suggestion: 3. Conformity assessment bodies should preferably provide an accreditation certificate to demonstrate compliance with the requirements laid down in Article 33. Only where the conformity assessment body concerned cannot provide an accreditation certificate, it shall provide the notifying authority with the documentary evidence necessary for the verification, recognition and regular monitoring of its compliance with the requirements laid down in Article 33. For notified bodies which are designated under any other Union harmonisation legislation, all documents and certificates linked to those designations may be used to support their designation procedure under this Regulation, as appropriate.
Comment: We suggest specifying in Art. 31.3 that accreditation should be the starting point. Exceptions should only be possible in case of compelling reasons.
 
 
 
Article 32
Notification procedure
Comment: There is no paragraph dedicated to the consequences of an objection and whether this triggers the processes described in article 36 or article 37.
 
 
 
1. Notifying authorities may notify only conformity assessment bodies which have satisfied the requirements laid down in Article 33.
 
 
 

2. Notifying authorities shall notify the Commission and the other Member States using the electronic notification tool developed and managed by the Commission.
Drafting suggestion: using the electronic notification tool [reference needed] developed and managed by the Commission.
Comment: Is this a new tool based on existing tools in other areas? Please provide a reference to that or explain how this tool will be developed.
 
 
 
3. The notification shall include full details of the conformity assessment activities, the conformity assessment module or modules and the artificial intelligence technologies concerned.
Drafting suggestion: The notification shall include full details of the conformity assessment activities, the conformity assessment module or modules and the artificial intelligence technologies concerned and the relevant attestation of competence. Where a notification is not based on an accreditation certificate as referred to in Article 31(2), the notifying authority shall provide the Commission and the other Member States with documentary evidence which attests to the conformity assessment body's competence and the arrangements in place to ensure that that body will be monitored regularly and will continue to satisfy the requirements laid down in Article 33.
Comment: We would suggest adding a paragraph on notifications that are not based on accreditation certificates in order to better ensure competence and compliance with Art. 33.
 
 
 

4. The conformity assessment body concerned may perform the activities of a notified body only where no objections are raised by the Commission or the other Member States within one month of a notification.
Drafting suggestion: The conformity assessment body concerned may perform the activities of a notified body as referred to in article 31/recital 64 only where no objections are raised by the Commission or the other Member States within one month of a notification within two weeks of the notification where it includes an accreditation certificate referred to in Article 31 paragraph 2 or within two months of the notification where it includes documentary evidence referred to in Article 31 paragraph 3.
Comment: Suggestion to clarify to which activities this article refers; is this article 31? We would suggest making a distinction between notifications based on accreditation certificates and notifications based on other documentary evidence. The latter would require more intensive investigation. Also we would suggest clarifying that only assessment bodies that have been notified without any objections being raised may be considered notified bodies.
 
 
 
5. Notifying authorities shall notify the Commission and the other Member States of any subsequent relevant changes to the notification.
Drafting suggestion: Notifying authorities shall notify the Commission and the other Member States of any subsequent relevant changes to the notification referred to in paragraph 2.
Comment: Minor specification.
 
 
 
Article 33
Notified bodies
Comment: We noticed that no presumption of conformity has been introduced in this article for notified bodies that demonstrate conformity with applicable harmonised standards that cover the requirements set out in this Regulation. We would suggest to consider including such a presumption of conformity.
 
 
 
1. Notified bodies shall verify the conformity of high-risk AI system in accordance with the conformity assessment procedures referred to in Article 43.
Comment: What conformity of high risk AI shall the notified bodies exactly verify (production, use, etc.)?
 
 
 
2. Notified bodies shall satisfy the organisational, quality management, resources and process requirements that are necessary to fulfil their tasks.
 
 
 
3. The organisational structure, allocation of responsibilities, reporting lines and operation of notified bodies shall be such as to ensure that there is confidence in the performance by and in the results of the conformity assessment activities that the notified bodies conduct.
 
 
 

4. Notified bodies shall be independent of the provider of a high-risk AI system in relation to which it performs conformity assessment activities. Notified bodies shall also be independent of any other operator having an economic interest in the high-risk AI system that is assessed, as well as of any competitors of the provider.
Drafting suggestion: Notified bodies shall be independent of the provider of a high-risk AI system in relation to which it performs conformity assessment activities. Notified bodies shall ensure there is no conflict of interest and also be independent of any other operator having an economic interest in the high-risk AI system that is assessed, as well as of any competitors of the provider.
Comment: Overall independence seems a better way. Added ‘no conflict of interest’ to make sure that what is not mentioned here is also covered.
 
 
 
5. Notified bodies shall be organised and operated so as to safeguard the independence, objectivity and impartiality of their activities. Notified bodies shall document and implement a structure and procedures to safeguard impartiality and to promote and apply the principles of impartiality throughout their organisation, personnel and assessment activities.
Drafting suggestion: Notified bodies shall be organised and operated so as to safeguard the independence, objectivity and impartiality of their activities. Notified bodies shall document and implement a structure and procedures to safeguard impartiality and to promote and apply the principles of impartiality throughout their organisation, personnel and assessment activities. The remuneration of the top level management and the personnel responsible for carrying out the conformity assessments tasks shall not depend on the number of conformity assessments carried out or on the results of those assessments.
Comment: Addition to ensure impartiality at top level management.
 
 
 
6. Notified bodies shall have documented procedures in place ensuring that their personnel, committees, subsidiaries, subcontractors and any associated body or personnel of external bodies respect the confidentiality of the information which comes into their possession during the performance of conformity assessment activities, except when disclosure is required by law. The staff of notified bodies shall be bound to observe professional secrecy with regard to all information obtained in carrying out their tasks under this Regulation, except in relation to the notifying authorities of the Member State in which their activities are carried out.
Drafting suggestion: except in relation to the notifying and national authorities of the Member State in which their activities are carried out.
Comment: Suggestion for minor specification to enable information sharing with national authorities.
 
 
 

7. Notified bodies shall have procedures for the performance of activities which take due account of the size of an undertaking, the sector in which it operates, its structure, the degree of complexity of the AI system in question.
 
 
 
8. Notified bodies shall take out appropriate liability insurance for their conformity assessment activities, unless liability is assumed by the Member State concerned in accordance with national law or that Member State is directly responsible for the conformity assessment.
Drafting suggestion: Notified bodies shall take out appropriate liability insurance for their conformity assessment activities, unless liability is assumed by the Member State concerned in accordance with national law in the notifying Member State or that Member State is itself directly responsible for the conformity assessment.
Comment: For clarification: is liability insurance not a matter of responsibility of notifying bodies themselves? Why is it in the article?
 
 
 
9. Notified bodies shall be capable of carrying out all the tasks falling to them under this Regulation with the highest degree of professional integrity and the requisite competence in the specific field, whether those tasks are carried out by notified bodies themselves or on their behalf and under their responsibility.
 
 
 
10. Notified bodies shall have sufficient internal competences to be able to effectively evaluate the tasks conducted by external parties on their behalf. To that end, at all times and for each conformity assessment procedure and each type of high-risk AI system in relation to which they have been designated, the notified body shall have permanent availability of sufficient administrative, technical and scientific personnel who possess experience and knowledge relating to the relevant artificial intelligence technologies, data and data computing and to the requirements set out in Chapter 2 of this Title.
Drafting suggestion: who possess experience and knowledge relating to the relevant artificial intelligence technologies, data and data computing, fundamental rights, health and safety risks and to the requirements set out in Chapter 2 of this Title.
Comment: In addition to technical knowledge, notified bodies should have substantive expertise relating to fundamental rights, in order to be able to appropriately assess whether the manner in which Chapter 2 of this Title is applied effectively safeguards against fundamental rights risks and health and safety risks.
 
 
 
11. Notified bodies shall participate in coordination activities as referred to in Article 38. They shall also take part directly or be represented in European standardisation organisations, or ensure that they are aware and up to date in respect of relevant standards.
 
 
 
12. Notified bodies shall make available and submit upon request all relevant documentation, including the providers’ documentation, to the notifying authority referred to in Article 30 to allow it to conduct its assessment, designation, notification, monitoring and surveillance activities and to facilitate the assessment outlined in this Chapter.
 
 
 
Article 34 
 
 
Subsidiaries of and subcontracting by notified bodies
 
 
 
1. Where a notified body subcontracts specific tasks connected with the conformity assessment or has recourse to a subsidiary, it shall ensure that the subcontractor or the subsidiary meets the requirements laid down in Article 33 and shall inform the notifying authority accordingly.
Comment: More general question: is it necessary to make use of subcontracting? If the answer is yes, then describing in which circumstances this is allowed and/or in which it is not could give a clearer and more transparent picture.
 
 
 
2. Notified bodies shall take full responsibility for the tasks performed by subcontractors or subsidiaries wherever these are established.
 
 
 
3. Activities may be subcontracted or carried out by a subsidiary only with the agreement of the provider.
Drafting suggestion: 3. Activities may be subcontracted or carried out by a subsidiary only with the agreement of the provider. The establishment and the supervision of internal procedures, general policies, codes of conduct or other internal rules, the assignment of personnel to specific tasks and the decision on certification may not be delegated to a subcontractor or a subsidiary.
Comment: Given the ongoing discussions about subcontracting we would suggest specifying the desired scope of activities that may be subcontracted or carried out by a subsidiary. Some activities are not suited for subcontracting in our view.
 
 
 
4. Notified bodies shall keep at the disposal of the notifying authority the relevant documents concerning the assessment of the qualifications of the subcontractor or the subsidiary and the work carried out by them under this Regulation.
 
 
 
Article 35 
 
 
Identification numbers and lists of notified bodies designated under this Regulation
 
 
 
1. The Commission shall assign an identification number to notified bodies. It shall assign a single number, even where a body is notified under several Union acts.
 
 
 
2. The Commission shall make publicly available the list of the bodies notified under this Regulation, including the identification numbers that have been assigned to them and the activities for which they have been notified. The Commission shall ensure that the list is kept up to date.
 
 
 

Article 36 
 
 
Changes to notifications 
 
 
 
1. Where a notifying authority has suspicions or has been informed that a notified body no longer meets the requirements laid down in Article 33, or that it is failing to fulfil its obligations, that authority shall without delay investigate the matter with the utmost diligence. In that context, it shall inform the notified body concerned about the objections raised and give it the possibility to make its views known. If the notifying authority comes to the conclusion that the notified body no longer meets the requirements laid down in Article 33 or that it is failing to fulfil its obligations, it shall restrict, suspend or withdraw the notification as appropriate, depending on the seriousness of the failure. It shall also immediately inform the Commission and the other Member States accordingly.

 
 
 
2. In the event of restriction, suspension or withdrawal of notification, or where the notified body has ceased its activity, the notifying authority shall take appropriate steps to ensure that the files of that notified body are either taken over by another notified body or kept available for the responsible notifying authorities at their request.
 
 
 
Article 37 
 
 
Challenge to the competence of notified bodies 
 
 
 
1. The Commission shall, where necessary, investigate all cases where there are reasons to doubt whether a notified body complies with the requirements laid down in Article 33.
Drafting suggestion: The Commission shall, where necessary, investigate all cases where there are reasons to doubt whether a notified body complies with the requirements laid down in Article 33 and other requirements and responsibilities to which it is subject.
Comment: We would prefer a more specific approach in order to ensure compliance.
 
 
 

2. The notifying authority shall provide the Commission, on request, with all relevant information relating to the notification of the notified body concerned.
 
 
 
3. The Commission shall ensure that all confidential information obtained in the course of its investigations pursuant to this Article is treated confidentially.
 
 
 
4. Where the Commission ascertains that a notified body does not meet or no longer meets the requirements laid down in Article 33, it shall adopt a reasoned decision requesting the notifying Member State to take the necessary corrective measures, including withdrawal of notification if necessary. That implementing act shall be adopted in accordance with the examination procedure referred to in Article 74(2).
 
 
 

Article 38 
 
 
Coordination of notified bodies 
 
 
 
1. The Commission shall ensure that, with regard to the areas covered by this Regulation, appropriate coordination and cooperation between notified bodies active in the conformity assessment procedures of AI systems pursuant to this Regulation are put in place and properly operated in the form of a sectoral group of notified bodies.
 
 
 
2. Member States shall ensure that the bodies notified by them participate in the work of that group, directly or by means of designated representatives.
 
 
 
Article 39
Conformity assessment bodies of third countries
Comment: Suggestion to specify the procedure with third countries, referencing to existing agreements regarding conformity assessment bodies where possible and clarifying how this relates to the judgment that accreditation of a notified body must be done by the national accreditation body (located in the European Union) (C-142/20 - Analisi G. Caracciolo).
 
 
 
 
Conformity assessment bodies established under the law of a third country with which the Union has concluded an agreement may be authorised to carry out the activities of notified Bodies under this Regulation.
Drafting suggestion: Conformity assessment bodies established under the law of a third country with which the Union has concluded an agreement may be authorised to carry out the activities of notified Bodies under this Regulation, provided that they meet the requirements in Article 33.
Comment: Minor suggestion for specification; the agreement needs further elaboration with possible references to existing types of agreements. Furthermore we suggest to include a procedure when it is suspected that the notified body in a third country does not meet/no longer meets the requirements laid down in Article 33.
 
 
 
 
CHAPTER 5

STANDARDS, CONFORMITY ASSESSMENT, CERTIFICATES, REGISTRATION

 
 
 
Article 40
Harmonised standards
Comment: The standards mentioned here should solely concern genuinely technical aspects. The overall authority to set standards and perform oversight of issues that are not purely technical, such as bias mitigation, should remain in the remit of the legislative process guaranteeing parliamentary scrutiny and multistakeholder engagement.
 
 
 
High-risk AI systems which are in conformity with harmonised standards or parts thereof the references of which have been published in the Official Journal of the European Union shall be presumed to be in conformity with the requirements set out in Chapter 2 of this Title, to the extent those standards cover those requirements.
 
 
 
Article 41 
 
 
Common specifications 

 
 
 
1. Where harmonised standards referred to in Article 40 do not exist or where the Commission considers that the relevant harmonised standards are insufficient or that there is a need to address specific safety or fundamental right concerns, the Commission may, by means of implementing acts, adopt common specifications in respect of the requirements set out in Chapter 2 of this Title. Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 74(2).
 
 
 
2. The Commission, when preparing the common specifications referred to in paragraph 1, shall gather the views of relevant bodies or expert groups established under relevant sectorial Union law.
Drafting suggestion: The Commission, when preparing the common specifications referred to in paragraph 1, shall gather the views of relevant bodies or expert groups established under relevant sectorial Union law, and member states.
Comment: As common standards and specifications translate important requirements that aim to protect fundamental rights, member states should be at least consulted.
 
 
 

3. High-risk AI systems which are in conformity with the common specifications referred to in paragraph 1 shall be presumed to be in conformity with the requirements set out in Chapter 2 of this Title, to the extent those common specifications cover those requirements.
 
 
 
4. Where providers do not comply with the common specifications referred to in paragraph 1, they shall duly justify that they have adopted technical solutions that are at least equivalent thereto.
 
 
 
Article 42 
 
 
Presumption of conformity with certain requirements
 
 
 
1. Taking into account their intended purpose, high-risk AI systems that have been trained and tested on data concerning the specific geographical, behavioural and functional setting within which they are intended to be used shall be presumed to be in compliance with the requirement set out in Article 10(4).
Drafting suggestion: 1. Taking into account their intended purpose, high-risk AI systems that have been trained and tested on data concerning the specific geographical, behavioural and functional setting within which they are intended to be used shall be presumed to be in compliance with the requirement set out in Article 10(4).
Comment: We propose to delete this paragraph because we find that it does not increase clarity in relation to article 10(4).
 
 
 
2. High-risk AI systems that have been certified or for which a statement of conformity has been issued under a cybersecurity scheme pursuant to Regulation (EU) 2019/881 of the European Parliament and of the Council (1) and the references of which have been published in the Official Journal of the European Union shall be presumed to be in compliance with the cybersecurity requirements set out in Article 15 of this Regulation in so far as the cybersecurity certificate or statement of conformity or parts thereof cover those requirements.
 
 
 
                                                 
(1) Regulation (EU) 2019/881 of the European Parliament and of the Council of 17 April 2019 on ENISA (the European Union Agency for Cybersecurity) and on information and communications technology cybersecurity certification and repealing Regulation (EU) No 526/2013 (Cybersecurity Act) (OJ L 151, 7.6.2019, p. 1).

Article 43 
 
 
Conformity assessment 
 
 
 
1. For high-risk AI systems listed in point 1 of Annex III, where, in demonstrating the compliance of a high-risk AI system with the requirements set out in Chapter 2 of this Title, the provider has applied harmonised standards referred to in Article 40, or, where applicable, common specifications referred to in Article 41, the provider shall follow one of the following procedures:
 
 
 
(a) the conformity assessment procedure based on internal control referred to in Annex VI;
Comment: We underline the importance of the effectiveness of conformity assessments in case of both self- and third party assessments. We are currently considering and studying the option that the same procedural requirements for point 1 of Annex III would also be the preferred option for certain other points in Annex III.
 
 
 

(b) the conformity assessment procedure based on assessment of the quality management system and assessment of the technical documentation, with the involvement of a notified body, referred to in Annex VII.
 
 
 
Where, in demonstrating the compliance of a high-risk AI system with the requirements set out in Chapter 2 of this Title, the provider has not applied or has applied only in part harmonised standards referred to in Article 40, or where such harmonised standards do not exist and common specifications referred to in Article 41 are not available, the provider shall follow the conformity assessment procedure set out in Annex VII.
 
 
 
For the purpose of the conformity assessment procedure referred to in Annex VII, the provider may choose any of the notified bodies. However, when the system is intended to be put into service by law enforcement, immigration or asylum authorities as well as EU institutions, bodies or agencies, the market surveillance authority referred to in Article 63(5) or (6), as applicable, shall act as a notified body.
Drafting suggestion: However, when the system is intended to be put into service by law enforcement, immigration or asylum authorities or judicial authorities as well as EU institutions, bodies or agencies, the market surveillance authority referred to in Article 63(5) or (6), as applicable, shall act as a notified body.
Comment: Add judicial authorities in line with the suggested change in article 63. Reference to article 63 must also be changed accordingly.
 
 
 
2.	For high-risk AI systems referred to in points 2 to 8 of Annex III, providers shall follow the conformity assessment procedure based on internal control as referred to in Annex VI, which does not provide for the involvement of a notified body. For high-risk AI systems referred to in point 5(b) of Annex III, placed on the market or put into service by credit institutions regulated by Directive 2013/36/EU, the conformity assessment shall be carried out as part of the procedure referred to in Articles 97 to 101 of that Directive.

Comment: The CRD doesn't regulate the offering of (consumer) credit; this is regulated by the CCD and MCD. Should this article therefore not refer to the CCD and the MCD instead of the CRD? How does the commission see the difference in this regulation between banks offering credit and other non-bank parties that offer credit?
 
 
 
3. 
For high-risk AI systems, to which legal 
 
 
acts listed in Annex II, section A, apply, the 

provider shall follow the relevant conformity 
assessment as required under those legal acts. 
The requirements set out in Chapter 2 of this 
Title shall apply to those high-risk AI systems 
and shall be part of that assessment. Points 4.3., 
4.4., 4.5. and the fifth paragraph of point 4.6 of 
Annex VII shall also apply.  
 
 
 
For the purpose of that assessment, notified 
 
 
bodies which have been notified under those 
legal acts shall be entitled to control the 
conformity of the high-risk AI systems with the 
requirements set out in Chapter 2 of this Title, 
provided that the compliance of those notified 
bodies with requirements laid down in Article 
33(4), (9) and (10) has been assessed in the 
context of the notification procedure under those 
legal acts. 
 
 
 
Where the legal acts listed in Annex II, section 
 
 
A, enable the manufacturer of the product to opt 

out from a third-party conformity assessment, 
provided that that manufacturer has applied all 
harmonised standards covering all the relevant 
requirements, that manufacturer may make use 
of that option only if he has also applied 
harmonised standards or, where applicable, 
common specifications referred to in Article 41, 
covering the requirements set out in Chapter 2 
of this Title.  
 
 
 
4. 
High-risk AI systems shall undergo a new   
 
conformity assessment procedure whenever they 
are substantially modified, regardless of whether 
the modified system is intended to be further 
distributed or continues to be used by the 
current user. 
 
 
 
For high-risk AI systems that continue to learn after being placed on the market or put into service, changes to the high-risk AI system and its performance that have been pre-determined by the provider at the moment of the initial conformity assessment and are part of the information contained in the technical documentation referred to in point 2(f) of Annex IV, shall not constitute a substantial modification.

Drafting suggestion: "(…) shall not constitute a substantial modification, except if they have an impact on fundamental rights."
 
 
 
5. 
The Commission is empowered to adopt 
 
 
delegated acts in accordance with Article 73 for 
the purpose of updating Annexes VI and Annex 
VII in order to introduce elements of the 
conformity assessment procedures that become 
necessary in light of technical progress. 
 
 
 
6. 
The Commission is empowered to adopt 
 
 
delegated acts to amend paragraphs 1 and 2 in 
order to subject high-risk AI systems referred to 
in points 2 to 8 of Annex III to the conformity 
assessment procedure referred to in Annex VII 
or parts thereof. The Commission shall adopt 

such delegated acts taking into account the 
effectiveness of the conformity assessment 
procedure based on internal control referred to 
in Annex VI in preventing or minimizing the 
risks to health and safety and protection of 
fundamental rights posed by such systems as 
well as the availability of adequate capacities 
and resources among notified bodies. 
 
 
 
Article 44 
 
 
Certificates 
 
 
 
1. 
Certificates issued by notified bodies in 
 
 
accordance with Annex VII shall be drawn-up 
in an official Union language determined by the 
Member State in which the notified body is 
established or in an official Union language 
otherwise acceptable to the notified body.  
 
 
 

2.	Certificates shall be valid for the period they indicate, which shall not exceed five years. On application by the provider, the validity of a certificate may be extended for further periods, each not exceeding five years, based on a re-assessment in accordance with the applicable conformity assessment procedures.

Comment: What is the validity period for the conformity assessment based on the Annex VI procedure (internal control)?
 
 
 
3.	Where a notified body finds that an AI system no longer meets the requirements set out in Chapter 2 of this Title, it shall, taking account of the principle of proportionality, suspend or withdraw the certificate issued or impose any restrictions on it, unless compliance with those requirements is ensured by appropriate corrective action taken by the provider of the system within an appropriate deadline set by the notified body. The notified body shall give reasons for its decision.

Comment: What is 'an appropriate deadline' for measures taken by the provider to take appropriate corrective action?
 
 
 

Article 45 
 
 
Appeal against decisions of notified bodies 
 
 
 
Member States shall ensure that an appeal procedure against decisions of the notified bodies is available to parties having a legitimate interest in that decision.

Comment: Suggestion to explicate that civil society organisations are considered as 'party having a legitimate interest'.
 
 
 
Article 46 
 
 
Information obligations of notified bodies 
 
 
 
1. 
Notified bodies shall inform the notifying 
 
 
authority of the following:  
 
 
 
(a)  any Union technical documentation 
 
 
assessment certificates, any supplements to 
those certificates, quality management system 
approvals issued in accordance with the 
requirements of Annex VII; 
 
 
 

(b)  any refusal, restriction, suspension or 
 
 
withdrawal of a Union technical documentation 
assessment certificate or a quality management 
system approval issued in accordance with the 
requirements of Annex VII;  
 
 
 
(c)  any circumstances affecting the scope of 
 
 
or conditions for notification; 
 
 
 
(d)  any request for information which they 
 
 
have received from market surveillance 
authorities regarding conformity assessment 
activities; 
 
 
 
(e)  on request, conformity assessment 
 
 
activities performed within the scope of their 
notification and any other activity performed, 
including cross-border activities and 
subcontracting. 
 
 
 

2. 
Each notified body shall inform the other 
 
 
notified bodies of: 
 
 
 
(a)  quality management system approvals 
 
 
which it has refused, suspended or withdrawn, 
and, upon request, of quality system approvals 
which it has issued; 
 
 
 
(b)  EU technical documentation assessment 
 
 
certificates or any supplements thereto which it 
has refused, withdrawn, suspended or otherwise 
restricted, and, upon request, of the certificates 
and/or supplements thereto which it has issued. 
 
 
 
3. 
Each notified body shall provide the other   
 
notified bodies carrying out similar conformity 
assessment activities covering the same artificial 
intelligence technologies with relevant 
information on issues relating to negative and, 
on request, positive conformity assessment 
results. 

 
 
 
Article 47 
 
 
Derogation from conformity assessment 
procedure 
 
 
 
1.	By way of derogation from Article 43, any market surveillance authority may authorise the placing on the market or putting into service of specific high-risk AI systems within the territory of the Member State concerned, for exceptional reasons of public security or the protection of life and health of persons, environmental protection and the protection of key industrial and infrastructural assets. That authorisation shall be for a limited period of time, while the necessary conformity assessment procedures are being carried out, and shall terminate once those procedures have been completed. The completion of those procedures shall be undertaken without undue delay.

Drafting suggestion: "(..) for exceptional reasons of public security or the protection of life and health of persons, environmental protection and the protection of key industrial and infrastructural assets, and only to the extent that such authorisation is appropriate and necessary."

Comment: NL would like to add a definition of 'key industrial and infrastructural assets'.
 
 
 

2. 
The authorisation referred to in paragraph   
 
1 shall be issued only if the market surveillance 
authority concludes that the high-risk AI system 
complies with the requirements of Chapter 2 of 
this Title. The market surveillance authority 
shall inform the Commission and the other 
Member States of any authorisation issued 
pursuant to paragraph 1. 
 
 
 
3. 
Where, within 15 calendar days of receipt   
 
of the information referred to in paragraph 2, no 
objection has been raised by either a Member 
State or the Commission in respect of an 
authorisation issued by a market surveillance 
authority of a Member State in accordance with 
paragraph 1, that authorisation shall be deemed 
justified. 
 
 
 
4. 
Where, within 15 calendar days of receipt   
 
of the notification referred to in paragraph 2, 
objections are raised by a Member State against 

an authorisation issued by a market surveillance 
authority of another Member State, or where the 
Commission considers the authorisation to be 
contrary to Union law or the conclusion of the 
Member States regarding the compliance of the 
system as referred to in paragraph 2 to be 
unfounded, the Commission shall without delay 
enter into consultation with the relevant 
Member State; the operator(s) concerned shall 
be consulted and have the possibility to present 
their views. In view thereof, the Commission 
shall decide whether the authorisation is 
justified or not. The Commission shall address 
its decision to the Member State concerned and 
the relevant operator or operators. 
 
 
 
5. 
If the authorisation is considered 
 
 
unjustified, this shall be withdrawn by the 
market surveillance authority of the Member 
State concerned. 
 
 
 

6. 
By way of derogation from paragraphs 1 
 
 
to 5, for high-risk AI systems intended to be 
used as safety components of devices, or which 
are themselves devices, covered by Regulation 
(EU) 2017/745 and Regulation (EU) 2017/746, 
Article 59 of Regulation (EU) 2017/745 and 
Article 54 of Regulation (EU) 2017/746 shall 
apply also with regard to the derogation from 
the conformity assessment of the compliance 
with the requirements set out in Chapter 2 of 
this Title. 
 
 
 
Article 48 
 
 
EU declaration of conformity 
 
 
 
1.	The provider shall draw up a written EU declaration of conformity for each AI system and keep it at the disposal of the national competent authorities for 10 years after the AI system has been placed on the market or put into service. The EU declaration of conformity shall identify the AI system for which it has been drawn up. A copy of the EU declaration of conformity shall be given to the relevant national competent authorities upon request.

Drafting suggestion: "A copy of the EU declaration of conformity shall be submitted [replacing: given] to the relevant national competent authorities upon request."

Comment: It is important that companies must be able to submit (changes to) data and registrations (digitally) as simply as possible.
 
 
 
2. 
The EU declaration of conformity shall 
 
 
state that the high-risk AI system in question 
meets the requirements set out in Chapter 2 of 
this Title. The EU declaration of conformity 
shall contain the information set out in Annex V 
and shall be translated into an official Union 
language or languages required by the Member 
State(s) in which the high-risk AI system is 
made available.  
 
 
 
3. 
Where high-risk AI systems are subject to   
 
other Union harmonisation legislation which 
also requires an EU declaration of conformity, a 
single EU declaration of conformity shall be 
drawn up in respect of all Union legislations 
applicable to the high-risk AI system. The 

declaration shall contain all the information 
required for identification of the Union 
harmonisation legislation to which the 
declaration relates.  
 
 
 
4. 
By drawing up the EU declaration of 
 
 
conformity, the provider shall assume 
responsibility for compliance with the 
requirements set out in Chapter 2 of this Title. 
The provider shall keep the EU declaration of 
conformity up-to-date as appropriate. 
 
 
 
5. 
The Commission shall be empowered to 
 
 
adopt delegated acts in accordance with Article 
73 for the purpose of updating the content of the 
EU declaration of conformity set out in Annex 
V in order to introduce elements that become 
necessary in light of technical progress. 
 
 
 
Article 49 
 
 
CE marking of conformity 

 
 
 
1. 
The CE marking shall be affixed visibly, 
 
 
legibly and indelibly for high-risk AI systems. 
Where that is not possible or not warranted on 
account of the nature of the high-risk AI system, 
it shall be affixed to the packaging or to the 
accompanying documentation, as appropriate. 
 
 
 
2. 
The CE marking referred to in paragraph 1   
 
of this Article shall be subject to the general 
principles set out in Article 30 of Regulation 
(EC) No 765/2008.  
 
 
 
3. 
Where applicable, the CE marking shall 
 
 
be followed by the identification number of the 
notified body responsible for the conformity 
assessment procedures set out in Article 43. The 
identification number shall also be indicated in 
any promotional material which mentions that 
the high-risk AI system fulfils the requirements 
for CE marking. 

 
 
 
Article 50 
 
 
Document retention 
 
 
 
The provider shall, for a period ending 10 years after the AI system has been placed on the market or put into service, keep at the disposal of the national competent authorities:

Drafting suggestion: "The provider shall, for a period ending 5 [replacing: 10] years after the high-risk AI system has been placed on the market or put into service if no longer in use, keep at the disposal of the national competent authorities:"

Comment: The keeping of records, documentation and, where relevant, data sets should be limited to specific and identified high-risk AI systems only and for a limited time, to avoid burdensome and costly data storage requirements. 10 years is not proportionate for SMEs; suggestion to limit the document retention period.
 
 
 
 
(a)  the technical documentation referred to in   
 
Article 11;  
 
 
 
(b)  the documentation concerning the quality 
 
 
management system referred to in Article 17; 
 
 
 
(c)  the documentation concerning the changes   
 
approved by notified bodies where applicable;  

 
 
 
(d)  the decisions and other documents issued 
 
 
by the notified bodies where applicable;  
 
 
 
(e)  the EU declaration of conformity referred 
 
 
to in Article 48. 
 
 
 
Article 51 
 
 
Registration 
 
 
 
Before placing on the market or putting into 
 
 
service a high-risk AI system referred to in 
Article 6(2), the provider or, where applicable, 
the authorised representative shall register that 
system in the EU database referred to in Article 
60.  
 
 
 
TITLE IV 
 
 
 
 
 

TRANSPARENCY OBLIGATIONS 
 
 
FOR CERTAIN AI SYSTEMS 
 
 
 
Article 52 
 
 
Transparency obligations for certain AI systems 
 
 
 
1.	Providers shall ensure that AI systems intended to interact with natural persons are designed and developed in such a way that natural persons are informed that they are interacting with an AI system, unless this is obvious from the circumstances and the context of use. This obligation shall not apply to AI systems authorised by law to detect, prevent, investigate and prosecute criminal offences, unless those systems are available for the public to report a criminal offence.

Drafting suggestion: "This obligation shall not apply to AI systems authorised by law to detect, prevent, investigate and prosecute criminal offences insofar and as long as appropriate, proportional and necessary for these purposes, also considering any rights under international law, Union law or national law that can supersede these purposes."

Comment: Transparency should be aspired to as long as it does not harm the detection, prevention, investigation and prosecution of criminal offences. At all times, international human rights law must be complied with.

 
 
 
2.	Users of an emotion recognition system or a biometric categorisation system shall inform of the operation of the system the natural persons exposed thereto. This obligation shall not apply to AI systems used for biometric categorisation, which are permitted by law to detect, prevent and investigate criminal offences.

Drafting suggestion: "(…) which are permitted by law to detect, prevent, investigate and prosecute criminal offences insofar and as long as appropriate, proportional and necessary for these purposes, also considering any rights under international law, Union law or national law that can supersede these purposes."

Comment: See 52(1).
 
 
 
3.	Users of an AI system that generates or manipulates image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and would falsely appear to a person to be authentic or truthful ('deep fake'), shall disclose that the content has been artificially generated or manipulated.

Drafting suggestion: "(…) shall disclose that the content has been artificially generated or manipulated in a clear and visible manner."

Comment: NL wonders why deep fakes are not considered a high risk AI system. They can have an impact on fundamental rights and security. NL is in favour of creating transparency labels to aid citizens that encounter a 'deep fake' to help them determine the trustworthiness of the content. We do have some questions about article 52.3:
-	Will there be a standard / minimum requirements to disclose that the content has been artificially generated or manipulated? Not all such "labels" are as effective. Uniformity can furthermore help citizens recognise such labels.
The article should make more clear that it requires the producer of such deep fake content to disclose that the content has been artificially generated or manipulated.
 
 
 
However, the first subparagraph shall not apply where the use is authorised by law to detect, prevent, investigate and prosecute criminal offences or it is necessary for the exercise of the right to freedom of expression and the right to freedom of the arts and sciences guaranteed in the Charter of Fundamental Rights of the EU, and subject to appropriate safeguards for the rights and freedoms of third parties.

Drafting suggestion: "However, the first subparagraph shall not apply where the use is authorised by law to detect, prevent, investigate and prosecute criminal offences insofar and as long as appropriate, proportionate and necessary for these purposes, also considering any rights under international law, Union law or national law that can supersede these purposes, or it is appropriate and necessary for the exercise of the right to freedom of expression and the right to freedom of the arts and sciences guaranteed in the Charter of Fundamental Rights of the EU, and subject to appropriate safeguards for the rights and freedoms of third parties."

Comment: In exercising the right to freedom of expression, it shall not impede the rights of others guaranteed to third parties, like the right to privacy or family life.
 
 
 
4. 
Paragraphs 1, 2 and 3 shall not affect the 
 
 
requirements and obligations set out in Title III 
of this Regulation. 
 
 
 
TITLE IVA 
 
 
 
 
 
GENERAL PURPOSE AI SYSTEMS    
 
 
 
 
Article 52a 
 
 
 
 
 
General purpose AI systems

Comment: NL supports the inclusion of general purpose AI systems, but needs time to study this specific proposal. Also, a specific definition for 'general purpose AI systems' is lacking in the current text.

 
 
 
1. 
The placing on the market, putting into   
 
service or use of general purpose AI systems 
shall not, by themselves only, make those 
systems subject to the provisions of this 
Regulation. 
 
 
 
2. 
Any person who places on the market 
 
 
or puts into service under its own name or 
trademark or uses a general purpose AI 
system made available on the market or put 
into service for an intended purpose that 
makes it subject to the provisions of this 
Regulation shall be considered the provider 
of the AI system subject to the provisions of 
this Regulation. 
 
 
 
3. 
Paragraph 2 shall apply, mutatis 
 
 
mutandis, to any person who integrates a 
general purpose AI system made available on 
the market, with or without modifying it, into 

an AI system whose intended purpose makes 
it subject to the provisions of this Regulation. 
 
 
 
4. 
The provisions of this Article shall 
 
 
apply irrespective of whether the general 
purpose AI system is open source software or 
not.  
 
 
 
TITLE V 
 
 
 
 
 
MEASURES IN SUPPORT OF 
 
 
INNOVATION 
 
 
 
Article 53 
 
 
AI regulatory sandboxes  
 
 
 
1.	AI regulatory sandboxes established by one or more Member States competent authorities or the European Data Protection Supervisor shall provide a controlled environment that facilitates the development, testing and validation of innovative AI systems for a limited time before their placement on the market or putting into service pursuant to a specific plan. This shall take place under the direct supervision and guidance by the competent authorities with a view to ensuring compliance with the requirements of this Regulation and, where relevant, other Union and Member States legislation supervised within the sandbox.

Drafting suggestion: "1. AI regulatory sandboxes established by one or more Member States competent authorities or the European Data Protection Supervisor shall provide a controlled environment that facilitates the development, testing and validation of innovative AI systems for a limited time before their placement on the market or putting into service pursuant to a specific plan. This shall take place under the direct supervision and guidance by the competent authorities. The sandbox will enable technical, organisational experimentation and legal testing with a view to ensuring compliance with the requirements of this Regulation and, where relevant, other Union and Member States legislation supervised within the sandbox, enhancing legal certainty as well as understanding emerging risks and impact of AI-systems."

Comment: As now formulated, the focus of the sandboxes is on ensuring compliance with the requirements of this Regulation and other Union and Member State legislation monitored within the sandbox. It is important to broaden the stated objective of the sandboxes, also enabling increased understanding about risks and impacts as well as enhancing legal certainty, and bringing the article more in line with the stated objectives in recital 72. Furthermore, to achieve a trusted and expert-driven testing environment, it is important that all competent authorities, including those with domain-specific expertise, can be involved in regulatory sandboxes. We suggest amending the relevant recitals and definitions for increased clarity about the difference between 'competent authorities', including a wide range of supervisory organisations, and the 'national competent authority' as defined in article 3.
 
 
 
2.	Member States shall ensure that to the extent the innovative AI systems involve the processing of personal data or otherwise fall under the supervisory remit of other national authorities or competent authorities providing or supporting access to data, the national data protection authorities and those other national authorities are associated to the operation of the AI regulatory sandbox.

Comment: Please clarify that the GDPR applies.
 
 
 
3.	The AI regulatory sandboxes shall not affect the supervisory and corrective powers of the competent authorities. Any significant risks to health and safety and fundamental rights identified during the development and testing of such systems shall result in immediate mitigation and, failing that, in the suspension of the development and testing process until such mitigation takes place.

Drafting suggestion: "(…) shall result in immediate mitigation and, failing that, in the suspension or ending of the development and testing process until such mitigation takes place."

Comment: Suspension is sufficient until mitigation, but if it doesn't occur, there should be a possibility to end.
 
 
 
4. 
Participants in the AI regulatory sandbox 
 
 
shall remain liable under applicable Union and 
Member States liability legislation for any harm 
inflicted on third parties as a result from the 
experimentation taking place in the sandbox. 

 
 
 
5. 
Member States’ competent authorities that   
 
have established AI regulatory sandboxes shall 
coordinate their activities and cooperate within 
the framework of the European Artificial 
Intelligence Board. They shall submit annual 
reports to the Board and the Commission on the 
results from the implementation of those 
schemes, including good practices, lessons learnt 
and recommendations on their setup and, where 
relevant, on the application of this Regulation 
and other Union legislation supervised within 
the sandbox.  
 
 
 
6.	The modalities and the conditions of the operation of the AI regulatory sandboxes, including the eligibility criteria and the procedure for the application, selection, participation and exiting from the sandbox, and the rights and obligations of the participants shall be set out in implementing acts. Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 74(2).

Drafting suggestion: "The modalities and the conditions of the operation of the AI regulatory sandboxes, including the eligibility criteria and the procedure for the application, selection, participation and exiting from the sandbox, the termination of regulatory sandboxes and the rights and obligations of the participants shall be set out in implementing acts. Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 74(2)."
 
 
 
Article 54
Further processing of personal data for developing certain AI systems in the public interest in the AI regulatory sandbox

Comments:
Questions:
-	Why is the article (and recital 72) related to article 6(4) of the GDPR and related specifically to further processing? Why is it not posed as a separate (new) legal base? What are the advantages of this approach?
-	Do we assume correctly that the proposal only functions as a legal ground for personal data, and not for LED-data? This with regard to recital 72 and the specific mention of the need for a legal basis in MS law in article 54(1)(a)(i).
-	Do we assume correctly that the goal of article 54(2) is to make sure that MS can still by law limit further processing for specific purposes, even though it would be allowed under article 54(1)? So MS law restricting further processing of e.g. certain health data precludes the possibility created in article 54(1)(a)(ii)?
Suggestions:
-	The NL believes that if this article is to be seen as a horizontal legal ground for processing, it requires further specification. This might include (not exhaustive):
	o	the categories of data used (e.g. also article 9/10 categories of data?);
	o	a limited retention period (the duration of the project in the sandbox seems too general);
	o	further specification of the goals (e.g. public safety and health still seems to be rather general).
-	Explicit reference to the GDPR and its requirements (e.g. data protection principles, DPIA, security measures of article 35) could be made.
-	Especially if the aim is to process special categories of data, we are not yet convinced that this horizontal approach is feasible. It might be necessary to exclude these types of data from the scope of the legal ground.
 
 
 
 

1. 
In the AI regulatory sandbox personal data   
 
lawfully collected for other purposes shall be 
processed for the purposes of developing and 
testing certain innovative AI systems in the 
sandbox under the following conditions: 
 
 
 
(a)  the innovative AI systems shall be 
 
 
developed for safeguarding substantial public 
interest in one or more of the following areas: 
 
 
 
(i) 
the prevention, investigation, detection or 
 
 
prosecution of criminal offences or the 
execution of criminal penalties, including the 
safeguarding against and the prevention of 
threats to public security, under the control and 
responsibility of the competent authorities. The 
processing shall be based on Member State or 
Union law; 
 
 
 
(ii)  public safety and public health, including 
 
 
disease prevention, control and treatment; 

 
 
 
(iii)  a high level of protection and 
 
 
improvement of the quality of the environment;  
 
 
 
(b)  the data processed are necessary for 
 
 
complying with one or more of the requirements 
referred to in Title III, Chapter 2 where those 
requirements cannot be effectively fulfilled by 
processing anonymised, synthetic or other non-
personal data; 
 
 
 
(c)  there are effective monitoring mechanisms   
 
to identify if any high risks to the fundamental 
rights of the data subjects may arise during the 
sandbox experimentation as well as response 
mechanism to promptly mitigate those risks and, 
where necessary, stop the processing;  
 
 
 
(d)  any personal data to be processed in the 
 
 
context of the sandbox are in a functionally 
separate, isolated and protected data processing 

environment under the control of the 
participants and only authorised persons have 
access to that data;  
 
 
 
(e)  any personal data processed are not be 
 
 
transmitted, transferred or otherwise accessed 
by other parties;  
 
 
 
(f) 
any processing of personal data in the 
 
 
context of the sandbox do not lead to measures 
or decisions affecting the data subjects; 
 
 
 
(g)  any personal data processed in the context   
 
of the sandbox are deleted once the participation 
in the sandbox has terminated or the personal 
data has reached the end of its retention period;  
 
 
 
(h)  the logs of the processing of personal data   
 
in the context of the sandbox are kept for the 
duration of the participation in the sandbox and 
1 year after its termination, solely for the 

purpose of and only as long as necessary for 
fulfilling accountability and documentation 
obligations under this Article or other 
application Union or Member States legislation; 
 
 
 
(i) 
complete and detailed description of the 
 
 
process and rationale behind the training, testing 
and validation of the AI system is kept together 
with the testing results as part of the technical 
documentation in Annex IV; 
 
 
 
(j) 
a short summary of the AI project 
 
 
developed in the sandbox, its objectives and 
expected results published on the website of the 
competent authorities. 
 
 
 
2. 
Paragraph 1 is without prejudice to Union   
 
or Member States legislation excluding 
processing for other purposes than those 
explicitly mentioned in that legislation. 
 
 
 

Article 55
Measures for SME small-scale providers and users

Comment: NL supports the change from small-scale to SME.
 
 
 
1. Member States shall undertake the following actions:

Drafting suggestion: 1. Member States and the European Commission shall undertake the following actions:

Comment: The Netherlands would like to introduce a shared effort by member states and the European Commission to support SMEs across the EU. This improves the level playing field for SMEs as a result of synchronized guidance.
 
 
 
(a)  provide small-scale SME providers including and start-ups with priority access to the AI regulatory sandboxes to the extent that they fulfil the eligibility conditions;

Drafting suggestion: (a) ensure that competent authorities and the EDPS provide small-scale SME providers including and start-ups with priority access to the AI regulatory sandboxes to the extent that they fulfil the eligibility conditions;

Comment: Sandboxes are established by competent authorities according to article 53 and this amendment ensures that priority access will be designed to fit into the modalities and the conditions of the operation of the AI regulatory sandboxes that will be established based on the procedure in article 53.6.
 
 
 
(b)  organise specific awareness raising activities about the application of this Regulation tailored to the needs of the small-scale SME providers and users;

Drafting suggestion: (b) organise specific awareness raising activities about the application of this Regulation and the opportunities to engage in the European Digital Innovation Hubs and the Testing and Experimentation Facilities under the Digital Europe Programme, tailored to the needs of the small-scale SME providers and users;

Comment: This proposal establishes a concrete relation between the AI Act and the important opportunities offered to promote and enable innovation via the Digital Europe Programme and underlined in the coordinated action plan on AI, echoing recital 74.
 
 
 
(c)  where appropriate, establish a dedicated channel for communication with small-scale SME providers and user and other innovators to provide guidance and respond to queries about the implementation of this Regulation.

Drafting suggestion: (c) where appropriate, establish a dedicated channel for communication with small-scale SME providers and users and other innovators to provide guidance and respond to queries about the implementation of this Regulation.
 
 
 
2. 
The specific interests and needs of the 
 
 
small-scale SME providers shall be taken into 
account when setting the fees for conformity 
assessment under Article 43, reducing those fees 
proportionately to their size and market size. 
 
Drafting suggestion: 3. The European Commission provides guidance to member states to support SMEs with implementation of this Regulation in the form of workshops, guidance documents and tools.

Comment: NL proposes to include some specific services by the European Commission to help member states to support SME providers to implement the regulation, increase common understanding and promote innovation.

TITLE VI 
 
 
 
 
 
GOVERNANCE 
 
 
 
 
 
CHAPTER 1 
 
 
 
 
 
EUROPEAN ARTIFICIAL INTELLIGENCE 
 
 
BOARD 
 
 
 
Article 56 
 
 
Establishment of the European Artificial 
Intelligence Board 
 
 
 
1. 
A ‘European Artificial Intelligence Board’   
 
(the ‘Board’) is established. 
 
 
 
2. 
The Board shall provide advice and 
 
 
assistance to the Commission in order to: 
 
 
 

(a)  contribute to the effective cooperation of 
 
 
the national supervisory authorities and the 
Commission with regard to matters covered by 
this Regulation; 
 
 
 
(b)  coordinate and contribute to guidance and   
 
analysis by the Commission and the national 
supervisory authorities and other competent 
authorities on emerging issues across the 
internal market with regard to matters covered 
by this Regulation; 
 
 
 
(c)  assist the national supervisory authorities 
 
 
and the Commission in ensuring the consistent 
application of this Regulation. 
 
 
 
Article 57 
 
 
Structure of the Board  
 
 
 
1. The Board shall be composed of the national supervisory authorities, who shall be represented by the head or equivalent high-level official of that authority, and the European Data Protection Supervisor. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them.

Drafting suggestion: and the European Data Protection Supervisor. The EDPS functions as the competent authority for their supervision as per article 59.8 and article 71.

Comment: In this article it is not really clear what the role of the EDPS is. Therefore, we suggest referring to article 59.8 and 71, where the role of the EDPS is described.
 
 
 
2. The Board shall adopt its rules of procedure by a simple majority of its members, following the consent of the Commission. The rules of procedure shall also contain the operational aspects related to the execution of the Board’s tasks as listed in Article 58. The Board may establish sub-groups as appropriate for the purpose of examining specific questions.

Drafting suggestion: The Board shall adopt its rules of procedure by a simple qualified majority of its members, following the consent of in alignment with the Commission. The rules of procedure shall also contain the operational aspects related to the execution of the Board’s tasks as listed in Article 58. The Board may establish sub-groups as appropriate for the purpose of examining specific questions.

Comment: In case of a simple majority, there is a major risk that almost half the MS might not agree with certain guidance. Especially with very disputable issues, a simple majority is undesirable. With a much larger majority, sufficient acceptance of guidance will be received.

Besides, if we read this article correctly, the article assigns a veto right to the Commission regarding the rules of procedure. This seems to go beyond the usual set-up for this type of Boards in other regulations. Do we interpret this correctly, and if yes, why is this deemed necessary? The rules of procedure will also cover the voting rules of the Board; through the above veto right the Commission has the opportunity to de facto dictate the voting rules.
 
 
 
 
3. The Board shall be chaired by the Commission. The Commission shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure. The Commission shall provide administrative and analytical support for the activities of the Board pursuant to this Regulation.

Drafting suggestion: The Board shall be chaired by the Commission one of the NSA (it can be rotated every 6 months). The Commission chair shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure. The Commission shall provide administrative and analytical support for the activities of the Board pursuant to this Regulation.

Comment: The Board is also an advisory board for the EC and the Commission is also the secretary. This already makes the role of the Commission towards the Board quite strong. The role of the Board and its independence is better served by an NSA as the chair.
 
 
 
4. 
The Board may invite external experts and   
 
observers to attend its meetings and may hold 
exchanges with interested third parties to inform 

its activities to an appropriate extent. To that 
end the Commission may facilitate exchanges 
between the Board and other Union bodies, 
offices, agencies and advisory groups. 
 
 
 
Article 58 
 
 
Tasks of the Board 
 
 
 
When providing advice and assistance to the 
 
 
Commission in the context of Article 56(2), the 
Board shall in particular: 
 
 
 
(a)  collect and share expertise and best 
 
 
practices among Member States; 
 
 
 
(b)  contribute to uniform administrative 
 
 
practices in the Member States, including for the 
functioning of regulatory sandboxes referred to 
in Article 53; 
 
 
 

(c)  issue opinions, recommendations or 
 
 
written contributions on matters related to the 
implementation of this Regulation, in particular 
 
 
 
(i) 
on technical specifications or existing 
 
 
standards regarding the requirements set out in 
Title III, Chapter 2,  
 
 
 
(ii)  on the use of harmonised standards or 
 
 
common specifications referred to in Articles 40 
and 41, 
 
 
 
(iii)  on the preparation of guidance documents,   
 
including the guidelines concerning the setting 
of administrative fines referred to in Article 71.; 
 
 
 
(d)  issue an advisory opinion on the need 
 
 
for amendment of Annex I and Annex III, 
 
including in light of available evidence. 
 
 
 
CHAPTER 2 
 
 

 
 
 
NATIONAL COMPETENT AUTHORITIES 
 
 
 
 
 
Article 59 
 
 
Designation of national competent authorities  
 
 
 
1. 
National competent authorities shall be 
 
 
established or designated by each Member State 
for the purpose of ensuring the application and 
implementation of this Regulation. National 
competent authorities shall be organised so as to 
safeguard the objectivity and impartiality of 
their activities and tasks. 
 
 
 
2. 
Each Member State shall designate a 
 
 
national supervisory authority among the 
national competent authorities. The national 
supervisory authority shall act as notifying 
authority and market surveillance authority 
unless a Member State has organisational and 

administrative reasons to designate more than 
one authority. 
 
 
 
3. Member States shall inform the Commission of their designation or designations and, where applicable, the reasons for designating more than one authority.
 
 
 
4. Member States shall ensure that national competent authorities are provided with adequate financial and human resources to fulfil their tasks under this Regulation. In particular, national competent authorities shall have a sufficient number of personnel permanently available whose competences and expertise shall include an in-depth understanding of artificial intelligence technologies, data and data computing, fundamental rights, health and safety risks and knowledge of existing standards and legal requirements.

Drafting suggestion: In particular, national competent authorities shall have a sufficient number of human resources available whose competences and expertise shall include an in-depth understanding of artificial intelligence technologies, data and data computing, fundamental rights, health and safety risks and knowledge of existing standards and legal requirements.

Comment: There needs to be some flexibility regarding sharing of expertise between national competent authorities (such as resource pools). Restricting human resources to ‘personnel’ and requiring that staff is ‘permanently’ available is unnecessarily prescriptive.
 
 
 

5. 
Member States shall report to the 
 
 
Commission on an annual basis on the status of 
the financial and human resources of the 
national competent authorities with an 
assessment of their adequacy. The Commission 
shall transmit that information to the Board for 
discussion and possible recommendations.  
 
 
 
6. 
The Commission shall facilitate the 
 
 
exchange of experience between national 
competent authorities. 
 
 
 
7. 
National competent authorities may 
 
 
provide guidance and advice on the 
implementation of this Regulation, including 
tailored to small-scale SME providers. 
Whenever national competent authorities intend 
to provide guidance and advice with regard to 
an AI system in areas covered by other Union 
legislation, the competent national authorities 
under that Union legislation shall be consulted, 

as appropriate. Member States may also 
establish one central contact point for 
communication with operators. 
 
 
 
8. 
When Union institutions, agencies and 
 
 
bodies fall within the scope of this Regulation, 
the European Data Protection Supervisor shall 
act as the competent authority for their 
supervision. 
 
 
 
TITLE VII 
 
 
 
 
 
EU DATABASE FOR STAND-
 
 
ALONE HIGH-RISK AI SYSTEMS 
 
 
 
Article 60
EU database for stand-alone high-risk AI systems

Comment: Please specify when the database should be filled, modified, etc. and how the responsibilities are arranged, in particular with regards to the responsibility of the member states.

 
 
 
1. The Commission shall, in collaboration with the Member States, set up and maintain a EU database containing information referred to in paragraph 2 concerning high-risk AI systems referred to in Article 6(2) which are registered in accordance with Article 51.

Drafting suggestion: to in Article 6(2) which are registered in accordance with Article 51. This obligation shall not apply to AI systems used by law enforcement to detect, prevent, investigate and prosecute criminal offences including the safeguarding against and the prevention of threats to public security, under the control and responsibility of the competent authorities when publication may hinder criminal prosecution or ongoing investigations, insofar and as long as proportional, appropriate and necessary for these purposes.

Comment: Maximum transparency should also be aimed for in the context of law enforcement (LEA). However, the publication of AI systems used within the LEA context should not be disclosed to the public if this could lead to the hindering of criminal prosecution, ongoing investigations etc. (e.g. gaming the system). Examples are tools for specific projects with a limited time scope.
 
 
 
 
2. The data listed in Annex VIII shall be entered into the EU database by the providers. The Commission shall provide them with technical and administrative support.

Comment: For law enforcement authorities, no personal data should be entered in the database in case the LEA is a provider.
 
 
 

3. 
Information contained in the EU database 
 
 
shall be accessible to the public. 
 
 
 
4. The EU database shall contain personal data only insofar as necessary for collecting and processing information in accordance with this Regulation. That information shall include the names and contact details of natural persons who are responsible for registering the system and have the legal authority to represent the provider.

Drafting suggestion (Option 1): 4. The EU database shall contain no personal data except for the information as listed in Annex VIII only insofar as necessary for collecting and processing information in accordance with this Regulation. That information shall include the names and contact details of natural persons who are responsible for registering the system and have the legal authority to represent the provider.

Drafting suggestion (Option 2): 4. The EU database shall contain no personal data, only insofar as necessary for collecting and processing information in accordance with this Regulation. That information which shall include the names and contact details of natural persons who are responsible for registering the system and have the legal authority to represent the provider.

Comment: The suggestions for the database not to include any personal data, except for what is listed in Annex VIII, follow from the principle of data minimisation from the GDPR. How does the exemption ‘this information shall not be provided for high-risk AI systems in the areas of law enforcement (…)’ relate to the exception regarding tax and customs authorities, laid down in preamble no. 38? Publication of these electronic instructions of use by tax and customs authorities might encourage misuse (‘gaming the system’).
 
 
 
5. 
The Commission shall be the controller of   
 
the EU database. It shall also ensure to 
providers adequate technical and administrative 
support. 
 
 
 
TITLE VIII 
 
 
 
 
 
POST-MARKET MONITORING, 
 
 
INFORMATION SHARING, 
MARKET SURVEILLANCE 
 
 
 
CHAPTER 1 
 
 
 
 
 
POST-MARKET MONITORING 
 
 
 
 
 

Article 61
Post-market monitoring by providers and post-market monitoring plan for high-risk AI systems

Comment: For post-market monitoring, it is important that the impact and feasibility of the proposed obligation is clear. Oftentimes AI systems, especially when integrated in a product, are hard to monitor. Against this background, a post-market monitoring obligation laid down in art. 61 should focus on requirements that are necessary to provide an appropriate level of protection against risks and can be met by a provider of high-risk AI systems.
 
 
 
1. 
Providers shall establish and document a 
 
 
post-market monitoring system in a manner that 
is proportionate to the nature of the artificial 
intelligence technologies and the risks of the 
high-risk AI system. 
 
 
 
2. 
The post-market monitoring system shall 
 
 
actively and systematically collect, document 
and analyse relevant data provided by users or 
collected through other sources on the 

performance of high-risk AI systems throughout 
their lifetime, and allow the provider to evaluate 
the continuous compliance of AI systems with 
the requirements set out in Title III, Chapter 2. 
 
 
 
3. 
The post-market monitoring system shall 
 
 
be based on a post-market monitoring plan. The 
post-market monitoring plan shall be part of the 
technical documentation referred to in Annex 
IV. The Commission shall adopt an 
implementing act laying down detailed 
provisions establishing a template for the post-
market monitoring plan and the list of elements 
to be included in the plan. 
 
 
 
4. 
For high-risk AI systems covered by the 
 
 
legal acts referred to in Annex II, where a post-
market monitoring system and plan is already 
established under that legislation, the elements 
described in paragraphs 1, 2 and 3 shall be 

integrated into that system and plan as 
appropriate. 
 
 
 
The first subparagraph shall also apply to high-
 
 
risk AI systems referred to in point 5(b) of 
Annex III placed on the market or put into 
service by credit institutions regulated by 
Directive 2013/36/EU. 
 
 
 
CHAPTER 2 
 
 
 
 
 
SHARING OF INFORMATION ON SERIOUS 
 
 
INCIDENTS AND MALFUNCTIONING 
 
 
 
Article 62 
 
 
Reporting of serious incidents and of 
malfunctioning 
 
 
 
1. Providers of high-risk AI systems placed on the Union market shall report any serious incident or any malfunctioning of those systems which constitutes a breach of obligations under Union law intended to protect fundamental rights to the market surveillance authorities of the Member States where that incident or breach occurred.

Drafting suggestion: 1. Providers of high-risk AI systems placed on the Union market shall report any serious incident referred to in Article 3(44)(c), including incidents involving a violation of fundamental rights, or any malfunctioning of those systems which constitutes a breach of obligations under Union law intended to protect fundamental rights to the market surveillance authorities of the Member States where that incident or breach occurred.

Comment: To be consistent and clear; the reference to Article 3(44)(c) is also made in paragraph 2.
 
 
 
Such notification shall be made immediately 
 
 
after the provider has established a causal link 
between the AI system and the serious incident 
or malfunctioning or the reasonable likelihood 
of such a link, and, in any event, not later than 
15 days after the providers becomes aware of 
the serious incident or of the malfunctioning. 
 
 
 
2. 
Upon receiving a notification related to a 
 
 
serious incident referred to in Article 3(44)(c) 
a breach of obligations under Union law 
intended to protect fundamental rights, the 

relevant market surveillance authority shall 
inform the national public authorities or bodies 
referred to in Article 64(3). The Commission 
shall develop dedicated guidance to facilitate 
compliance with the obligations set out in 
paragraph 1. That guidance shall be issued 12 
months after the entry into force of this 
Regulation, at the latest. 
 
 
 
3. 
For high-risk AI systems referred to in 
 
 
point 5(b) of Annex III which are placed on the 
market or put into service by providers that are 
credit institutions regulated by Directive 
2013/36/EU and for high-risk AI systems which 
are safety components of devices, or are 
themselves devices, covered by Regulation (EU) 
2017/745 and Regulation (EU) 2017/746, the 
notification of serious incidents or 
malfunctioning shall be limited to those 
referred to in Article 3(44)(c) that constitute a breach of obligations under Union law intended to protect fundamental rights.
 
 
 
CHAPTER 3 
 
 
 
 
 
ENFORCEMENT  
 
 
 
 
 
Article 63 
 
 
Market surveillance and control of AI systems in 
the Union market 
 
 
 
1. Regulation (EU) 2019/1020 shall apply to AI systems covered by this Regulation. However, for the purpose of the effective enforcement of this Regulation:

Comment: Does “this Regulation” refer to the AI Act? Does “AI systems” mean that Regulation (EU) 2019/1020 is applicable to all AI systems covered by the AI Act (and not high-risk systems only)?
 
 
 
(a)  any reference to an economic operator 
 
 
under Regulation (EU) 2019/1020 shall be 
understood as including all operators identified 

in Title III, Chapter 3 Article 2 of this 
Regulation; 
 
 
 
(b)  any reference to a product under 
 
 
Regulation (EU) 2019/1020 shall be understood 
as including all AI systems falling within the 
scope of this Regulation. 
 
 
 
2. The national supervisory authority shall report to the Commission on a regular basis the outcomes of relevant market surveillance activities. The national supervisory authority shall report, without delay, to the Commission and relevant national competition authorities any information identified in the course of market surveillance activities that may be of potential interest for the application of Union law on competition rules.

Comment: The first sentence requires clarification.
 
 
 
3. 
For high-risk AI systems, related to 
 
 
products to which legal acts listed in Annex II, 

section A apply, the market surveillance 
authority for the purposes of this Regulation 
shall be the authority responsible for market 
surveillance activities designated under those 
legal acts. 
 
 
 
4. 
For AI systems placed on the market, put 
 
 
into service or used by financial institutions 
regulated by Union legislation on financial 
services, the market surveillance authority for 
the purposes of this Regulation shall be the 
relevant authority responsible for the financial 
supervision of those institutions under that 
legislation. 
 
 
 
5. For AI systems listed in point 1(a) in so far as the systems are used for law enforcement purposes, points 6 and 7 of Annex III, Member States shall designate as market surveillance authorities for the purposes of this Regulation either the competent data protection supervisory authorities under Directive (EU) 2016/680, or Regulation 2016/679 or the national competent authorities supervising the activities of the law enforcement, immigration or asylum authorities putting into service or using those systems.

Drafting suggestion: For AI systems listed in point 1(a) in so far as the systems are used for law enforcement purposes, points 6 and 7 of Annex III, Member States shall designate as market surveillance authorities placed on the market, put into service or used by law enforcement, immigration or asylum authorities agencies, the market surveillance authorities for the purposes of this Regulation shall be either the competent data protection supervisory authorities under Directive (EU) 2016/680, or Regulation 2016/679 or the national competent authorities supervising the activities of law enforcement, immigration or asylum authorities putting into service or using those systems.

Comment: NL proposes here to use a similar formulation as for par. 4.
 
 
 
6. Where Union institutions, agencies and bodies fall within the scope of this Regulation, the European Data Protection Supervisor shall act as their market surveillance authority.

Drafting suggestion (new paragraph): 6. For AI systems placed on the market, put into service or used by judicial authorities, the market surveillance authorities for the purpose of this Regulation shall be the national competent authority supervising the activities of the judicial authorities.

Comment: We suggest adding an extra paragraph to protect the independence of the judiciary.
 
 
 
7. 
Member States shall facilitate the 
 
 
coordination between market surveillance 
authorities designated under this Regulation and 
other relevant national authorities or bodies 

which supervise the application of Union 
harmonisation legislation listed in Annex II or 
other Union legislation that might be relevant 
for the high-risk AI systems referred to in 
Annex III. 
 
 
 
Article 64 
 
 
Access to data and documentation 
 
 
 
1. Access to data and documentation in the context of their activities, the market surveillance authorities shall be granted full access to the training, validation and testing datasets used by the provider, including through application programming interfaces (‘API’) or other appropriate technical means and tools enabling remote access.

Drafting suggestion: Access to data and documentation in the context of their activities, where strictly necessary for their task the market surveillance authorities shall be granted full access to the training, validation and testing datasets used by the provider, including through application programming interfaces (‘API’) or other appropriate technical means and tools enabling remote access. Additional safeguards or restrictions may be in place in case these datasets are used to detect, prevent, investigate and prosecute criminal offences insofar and as long as necessary for these purposes.

Comment: The AI Act should not provide for an unlimited legal basis for sharing personal data with surveillance authorities. This should be limited to their respective tasks. Additional safeguards should be in place in case of personal data used by LEAs.

The access to data in article 64(1) and 64(2) seems rather restricted; in practice access to other information than the data sources highlighted here might be necessary. The wording needs to provide room for authorities to access all information necessary for their tasks, while at the same time acknowledging that unnecessary sharing of operational data should be avoided. It would be good to stress that the amount of information requested should be proportionate to the risks involved and take account of the size of the organisation.
 
 
 
2. Where necessary to assess the conformity of the high-risk AI system with the requirements set out in Title III, Chapter 2 and upon a reasoned request, the market surveillance authorities shall be granted access to the source code of the AI system.

Comment: See above; in practice access to other types of information might be needed just as well. This seems very limitative.
 
 
 
3. 
National public authorities or bodies 
 
 
which supervise or enforce the respect of 
obligations under Union law protecting 
fundamental rights in relation to the use of high-
risk AI systems referred to in Annex III shall 
have the power to request and access any 

Presidency compromise text 
Drafting Suggestions 
Comments 
documentation created or maintained under this 
Regulation when access to that documentation is 
necessary for the fulfilment of the competences 
under their mandate within the limits of their 
jurisdiction. The relevant public authority or 
body shall inform the market surveillance 
authority of the Member State concerned of any 
such request. 
 
 
 
4. 
By 3 months after the entering into force 
 
 
of this Regulation, each Member State shall 
identify the public authorities or bodies referred 
to in paragraph 3 and make a list publicly 
available on the website of the national 
supervisory authority. Member States shall 
notify the list to the Commission and all other 
Member States and keep the list up to date.  
 
 
 
5. Where the documentation referred to in paragraph 3 is insufficient to ascertain whether a breach of obligations under Union law intended to protect fundamental rights has occurred, the public authority or body referred to in paragraph 3 may make a reasoned request to the market surveillance authority to organise testing of the high-risk AI system through technical means. The market surveillance authority shall organise the testing with the close involvement of the requesting public authority or body within reasonable time following the request.
 
 
 
6. 
Any information and documentation 
 
 
obtained by the national public authorities or 
bodies referred to in paragraph 3 pursuant to the 
provisions of this Article shall be treated in 
compliance with the confidentiality obligations 
set out in Article 70. 
 
NEW Article 64A
Right to Complain

Drafting suggestion:
1. Without prejudice to any other administrative or judicial remedy, every natural person exposed to an AI system shall have the right to lodge a complaint with a supervisory authority, in particular in the Member State of his or her habitual residence, place of work or place of the alleged infringement, if the data subject considers that the use of AI systems affecting him or her infringes this Regulation or poses a serious risk to his or her fundamental rights.
2. The supervisory authority with which the complaint has been lodged shall inform the complainant on the progress and the outcome of the complaint, including the possibility of a judicial remedy.

Comment: The current proposal lacks any inclusion of end/natural persons in its provisions, as redress is left to pending proposals and domain-specific regulation. However, to increase legal protection and strengthen governance, a right to complain is necessary and inspired by the GDPR.
Article 65
Procedure for dealing with AI systems presenting a risk at national level

Comment: We presume a high-risk AI system will be evaluated against Title II, chapter 2 and 3. But what would the evaluation criteria be for (a) prohibited systems, (b) AI systems meant in art. 52, and (c) any other low/no-risk AI systems?
 
 
 
1. AI systems presenting a risk shall be understood as a product presenting a risk defined in Article 3, point 19 of Regulation (EU) 2019/1020 insofar as risks to the health or safety or to the protection of fundamental rights of persons are concerned.

Drafting suggestion: AI systems presenting a risk shall be understood as a product presenting a risk defined in Article 3, point 19 of Regulation (EU) 2019/1020 insofar as risks to the health or safety and as product presenting a risk to the protection of fundamental rights of persons are concerned in article 3. […].

Comment: This is the first time the notion of risk is explained a bit more. Article 3.19 of Reg 2019/1020 does not mention fundamental rights, so it seems to be insufficient for the purposes of this regulation. We propose to include a more fitting description of product presenting a risk in the definitions part of the regulation, which gives the appropriate attention to fundamental rights risks, as well as risks for harms at a societal level, rather than an individual level.
 
 
 
2. Where the market surveillance authority of a Member State has sufficient reasons to consider that an AI system presents a risk as referred to in paragraph 1, they shall carry out an evaluation of the AI system concerned in respect of its compliance with all the requirements and obligations laid down in this Regulation. When risks to the protection of fundamental rights are present, the market surveillance authority shall also inform the relevant national public authorities or bodies referred to in Article 64(3). The relevant operators shall cooperate as necessary with the market surveillance authorities and the other national public authorities or bodies referred to in Article 64(3).

Drafting suggestion: When risks to the protection of fundamental rights are present, […]
 
 
 
Where, in the course of that evaluation, the market surveillance authority finds that the AI system does not comply with the requirements and obligations laid down in this Regulation, it shall without delay require the relevant operator to take all appropriate corrective actions to bring the AI system into compliance, to withdraw the AI system from the market, or to recall it within a reasonable period, commensurate with the nature of the risk, as it may prescribe.

Drafting suggestion: […] to recall it within a reasonable period or withdraw it, commensurate with the nature of the risk, as it may prescribe.

Comment: If the AI system cannot be corrected, it should be possible to withdraw it.
 
 
 
The market surveillance authority shall inform 
 
 
the relevant notified body accordingly. Article 
18 of Regulation (EU) 2019/1020 shall apply to 
the measures referred to in the second 
subparagraph. 
 
 
 

3. Where the market surveillance authority considers that non-compliance is not restricted to its national territory, it shall inform the Commission and the other Member States of the results of the evaluation and of the actions which it has required the operator to take.

Drafting suggestion: […] it shall inform the Commission and the other Member States within a reasonable time of the results of the evaluation and of the actions which it has required the operator to take.

Comment: To prevent long delays, lack of information or doubling of evaluations.
 
 
 
4. 
The operator shall ensure that all 
 
 
appropriate corrective action is taken in respect 
of all the AI systems concerned that it has made 
available on the market throughout the Union. 
 
 
 
5. Where the operator of an AI system does not take adequate corrective action within the period referred to in paragraph 2, the market surveillance authority shall take all appropriate provisional measures to prohibit or restrict the AI system's being made available on its national market, to withdraw the product from that market or to recall it. That authority shall inform the Commission and the other Member States, without delay, of those measures.
 
 
 
6. 
The information referred to in paragraph 5   
 
shall include all available details, in particular 
the data necessary for the identification of the 
non-compliant AI system, the origin of the AI 
system, the nature of the non-compliance 
alleged and the risk involved, the nature and 
duration of the national measures taken and the 
arguments put forward by the relevant operator. 
In particular, the market surveillance authorities 
shall indicate whether the non-compliance is 
due to one or more of the following: 
 
 
 
(a)  a failure of the AI system to meet 
 
 
requirements set out in Title III, Chapter 2;  
 
 
 
(b)  shortcomings in the harmonised standards or common specifications referred to in Articles 40 and 41 conferring a presumption of conformity.
 
 
 
7. 
The market surveillance authorities of the 
 
 
Member States other than the market 
surveillance authority of the Member State 
initiating the procedure shall without delay 
inform the Commission and the other Member 
States of any measures adopted and of any 
additional information at their disposal relating 
to the non-compliance of the AI system 
concerned, and, in the event of disagreement 
with the notified national measure, of their 
objections. 
 
 
 
8. Where, within three months of receipt of the information referred to in paragraph 5, no objection has been raised by either a Member State or the Commission in respect of a provisional measure taken by a Member State, that measure shall be deemed justified. This is without prejudice to the procedural rights of the concerned operator in accordance with Article 18 of Regulation (EU) 2019/1020.
 
 
 
9. 
The market surveillance authorities of all 
 
 
Member States shall ensure that appropriate 
restrictive measures are taken in respect of the 
product concerned, such as withdrawal of the 
product from their market, without delay. 
 
 
 
Article 66 
 
 
Union safeguard procedure 
 
 
 
1. Where, within three months of receipt of the notification referred to in Article 65(5), objections are raised by a Member State against a measure taken by another Member State, or where the Commission considers the measure to be contrary to Union law, the Commission shall without delay enter into consultation with the relevant Member State and operator or operators and shall evaluate the national measure. On the basis of the results of that evaluation, the Commission shall decide whether the national measure is justified or not within 9 months from the notification referred to in Article 65(5) and notify such decision to the Member State concerned.

Comment: What happens with the system during this time of talks and investigations? Regarding "objections are raised by a Member State against a measure taken by another Member State, or where the Commission considers the measure to be contrary to Union law, the Commission shall without delay enter into consultation with the relevant Member State and operator or operators and shall evaluate the national measure": there can be two situations: another Member State does not agree with the decision of the Member State to approve the AI system, or to disapprove the AI system. However, Article 66(2) and (3) only cover the latter situation. What if other Member States object to an admission of an AI system to the internal market by another Member State? Is this situation deliberately left out? Is there a remedy against the decision of the Commission?
 
 
 
 
 
2. If the national measure is considered justified, all Member States shall take the measures necessary to ensure that the non-compliant AI system is withdrawn from their market, and shall inform the Commission accordingly. If the national measure is considered unjustified, the Member State concerned shall withdraw the measure.
 
 
 
3. 
Where the national measure is considered 
 
 
justified and the non-compliance of the AI 
system is attributed to shortcomings in the 
harmonised standards or common specifications 
referred to in Articles 40 and 41 of this 
Regulation, the Commission shall apply the 
procedure provided for in Article 11 of 
Regulation (EU) No 1025/2012. 
 
 
 
Article 67 
 
 
Compliant AI systems which present a risk 
 
 
 
1. Where, having performed an evaluation under Article 65, the market surveillance authority of a Member State finds that although an AI system is in compliance with this Regulation, it presents a risk to the health or safety of persons, to the compliance with obligations under Union or national law intended to protect fundamental rights or to other aspects of public interest protection, it shall require the relevant operator to take all appropriate measures to ensure that the AI system concerned, when placed on the market or put into service, no longer presents that risk, to withdraw the AI system from the market or to recall it within a reasonable period, commensurate with the nature of the risk, as it may prescribe.

Drafting suggestion: […] to the compliance with obligations under Union or national law intended to protect fundamental rights […]

Comment: It is unclear what is meant by 'obligations under Union or national law intended to protect fundamental rights'. Does this for instance cover the Charter of Fundamental Rights? Also, there may be (unacceptable) risks to fundamental rights which are not yet covered by national or Union law.
 
 
 
2. 
The provider or other relevant operators 
 
 
shall ensure that corrective action is taken in 
respect of all the AI systems concerned that they 
have made available on the market throughout 
the Union within the timeline prescribed by the 
market surveillance authority of the Member 
State referred to in paragraph 1. 
 
 
 

3. 
The Member State shall immediately 
 
 
inform the Commission and the other Member 
States. That information shall include all 
available details, in particular the data necessary 
for the identification of the AI system 
concerned, the origin and the supply chain of 
the AI system, the nature of the risk involved 
and the nature and duration of the national 
measures taken. 
 
 
 
4. 
The Commission shall without delay enter   
 
into consultation with the Member States and 
the relevant operator and shall evaluate the 
national measures taken. On the basis of the 
results of that evaluation, the Commission shall 
decide whether the measure is justified or not 
and, where necessary, propose appropriate 
measures. 
 
 
 
5. 
The Commission shall address its decision   
 
to the Member States. 

 
 
 
Article 68 
 
 
Formal non-compliance 
 
 
 
1. 
Where the market surveillance authority 
 
 
of a Member State makes one of the following 
findings, it shall require the relevant provider to 
put an end to the non-compliance concerned: 
 
 
 
(a)  the conformity marking has been affixed 
 
 
in violation of Article 49; 
 
 
 
(b)  the conformity marking has not been 
 
 
affixed; 
 
 
 
(c)  the EU declaration of conformity has not 
 
 
been drawn up; 
 
 
 
(d)  the EU declaration of conformity has not 
 
 
been drawn up correctly; 
 
 
 

(e)  the identification number of the notified 
 
 
body, which is involved in the conformity 
assessment procedure, where applicable, has not 
been affixed; 
 
 
 
2. 
Where the non-compliance referred to in 
 
 
paragraph 1 persists, the Member State 
concerned shall take all appropriate measures to 
restrict or prohibit the high-risk AI system being 
made available on the market or ensure that it is 
recalled or withdrawn from the market. 
 
 
 
TITLE IX 
 
 
 
 
 
CODES OF CONDUCT 
 
 
 
 
 
Article 69 
 
 
Codes of conduct 
 
 
 

1. 
The Commission and the Member States 
 
 
shall encourage and facilitate the drawing up of 
codes of conduct intended to foster the 
voluntary application to AI systems other than 
high-risk AI systems of the requirements set out 
in Title III, Chapter 2 on the basis of technical 
specifications and solutions that are appropriate 
means of ensuring compliance with such 
requirements in light of the intended purpose of 
the systems.  
 
 
 
2. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct intended to foster the voluntary application to AI systems of requirements related for example to environmental sustainability, accessibility for persons with a disability, stakeholders participation in the design and development of the AI systems and diversity of development teams on the basis of clear objectives and key performance indicators to measure the achievement of those objectives.
 
 
 
3. 
Codes of conduct may be drawn up by 
 
 
individual providers of AI systems or by 
organisations representing them or by both, 
including with the involvement of users and any 
interested stakeholders and their representative 
organisations. Codes of conduct may cover one 
or more AI systems taking into account the 
similarity of the intended purpose of the 
relevant systems. 
 
 
 
4. 
The Commission and the Board shall take   
 
into account the specific interests and needs of 
SME providers, including start-ups, when encouraging and facilitating the
drawing up of codes of conduct. 
 
 
 
TITLE X 
 
 

 
 
 
CONFIDENTIALITY AND 
 
 
PENALTIES  
 
 
 
Article 70 
 
 
Confidentiality 
 
 
 
1. 
National competent authorities and 
 
 
notified bodies involved in the application of 
this Regulation shall respect the confidentiality 
of information and data obtained in carrying out 
their tasks and activities in such a manner as to 
protect, in particular: 
 
 
 
(a)  intellectual property rights, and confidential business information or trade secrets of a natural or legal person, including source code, except the cases referred to in Article 5 of Directive 2016/943 on the protection of undisclosed know-how and business information (trade secrets) against their unlawful acquisition, use and disclosure apply.
 
 
 
(b)  the effective implementation of this Regulation, in particular for the purpose of inspections, investigations or audits;

(c)  public and national security interests;
 
 
 
(c)  integrity of criminal or administrative proceedings.

Drafting suggestion: integrity of criminal investigations and administrative proceedings.
 
 
 
2. Without prejudice to paragraph 1, information exchanged on a confidential basis between the national competent authorities and between national competent authorities and the Commission shall not be disclosed without the prior consultation of the originating national competent authority and the user when high-risk AI systems referred to in points 1, 6 and 7 of Annex III are used by law enforcement, immigration or asylum authorities, when such disclosure would jeopardise public and national security interests.

Drafting suggestion: […] when such disclosure would jeopardise public and or national security interests or jeopardise the detection, prevention, investigation and prosecution of criminal offences, including the safeguarding against and the prevention of threats to public security.

Comment: Supervision might go further than only the category of high-risk AI systems. We propose to delete the reference to Annex III; the exception for law enforcement would then still be intact.
 
 
 
When the law enforcement, immigration or asylum authorities are providers of high-risk AI systems referred to in points 1, 6 and 7 of Annex III, the technical documentation referred to in Annex IV shall remain within the premises of those authorities. Those authorities shall ensure that the market surveillance authorities referred to in Article 63(5) and (6), as applicable, can, upon request, immediately access the documentation or obtain a copy thereof. Only staff of the market surveillance authority holding the appropriate level of security clearance shall be allowed to access that documentation or any copy thereof.

Comment: A definition of 'premises' would be helpful. This could either be a physical location (e.g. offices) or the documentation could be digitally stored in a private cloud environment (e.g. a data center) which can be located off-premise. Should there not be a distinction between law enforcement and immigration and asylum authorities in this context?

 
 
 
3. 
Paragraphs 1 and 2 shall not affect the 
 
 
rights and obligations of the Commission, 
Member States and notified bodies with regard 
to the exchange of information and the 
dissemination of warnings, nor the obligations 
of the parties concerned to provide information 
under criminal law of the Member States. 
 
 
 
4. The Commission and Member States may exchange, where necessary, confidential information with regulatory authorities of third countries with which they have concluded bilateral or multilateral confidentiality arrangements guaranteeing an adequate level of confidentiality.

Drafting suggestion: The Commission and Member States may, notwithstanding paragraphs 1 and 2, exchange, where necessary, confidential information with regulatory authorities of third countries with which they have concluded bilateral or multilateral confidentiality arrangements guaranteeing an adequate level of confidentiality.
 
 
 
Article 71 
 
 
Penalties 
 
 
 

1. 
In compliance with the terms and 
 
 
conditions laid down in this Regulation, 
Member States shall lay down the rules on 
penalties, including administrative fines, 
applicable to infringements of this Regulation 
and shall take all measures necessary to ensure 
that they are properly and effectively 
implemented. The penalties provided for shall 
be effective, proportionate, and dissuasive. They 
shall take into particular account the interests of SME providers, including start-ups, and their economic viability.
 
 
 
2. 
The Member States shall notify the 
 
 
Commission of those rules and of those 
measures and shall notify it, without delay, of 
any subsequent amendment affecting them.  
 
 
 
3. The following infringements shall be subject to administrative fines of up to 30 000 000 EUR or, if the offender is a company, up to 6 % of its total worldwide annual turnover for the preceding financial year, whichever is higher:
 
 
 
(a)  non-compliance with the prohibition of 
 
 
the artificial intelligence practices referred to in 
Article 5; 
 
 
 
(b)  non-compliance of the AI system with the   
 
requirements laid down in Article 10. 
 
 
 
4. 
The non-compliance of the AI system 
 
 
with any requirements or obligations under this 
Regulation, other than those laid down in 
Articles 5 and 10, shall be subject to 
administrative fines of up to 20 000 000 EUR 
or, if the offender is a company, up to 4 % of its 
total worldwide annual turnover for the 
preceding financial year, whichever is higher. 
 
 
 
5. The supply of incorrect, incomplete or misleading information to notified bodies and national competent authorities in reply to a request shall be subject to administrative fines of up to 10 000 000 EUR or, if the offender is a company, up to 2 % of its total worldwide annual turnover for the preceding financial year, whichever is higher.
 
 
 
6. 
When deciding on the amount of the 
 
 
administrative fine in each individual case, all 
relevant circumstances of the specific situation 
shall be taken into account and due regard shall 
be given to the following: 
 
 
 
(a)  the nature, gravity and duration of the 
 
 
infringement and of its consequences; 
 
 
 
(b)  whether administrative fines have been 
 
 
already applied by other market surveillance 
authorities to the same operator for the same 
infringement. 
 
 
 

(c)  the size and market share of the operator 
 
 
committing the infringement; 
 
 
 
7. 
Each Member State shall lay down rules 
 
 
on whether and to what extent administrative 
fines may be imposed on public authorities and 
bodies established in that Member State. 
 
 
 
8. 
Depending on the legal system of the 
 
 
Member States, the rules on administrative fines 
may be applied in such a manner that the fines 
are imposed by competent national courts or other bodies as applicable in those Member
States. The application of such rules in those 
Member States shall have an equivalent effect. 
 
 
 
Article 72 
 
 
Administrative fines on Union institutions, 
agencies and bodies 
 
 
 

1. 
The European Data Protection Supervisor   
 
may impose administrative fines on Union 
institutions, agencies and bodies falling within 
the scope of this Regulation. When deciding 
whether to impose an administrative fine and 
deciding on the amount of the administrative 
fine in each individual case, all relevant 
circumstances of the specific situation shall be 
taken into account and due regard shall be given 
to the following: 
 
 
 
(a)  the nature, gravity and duration of the 
 
 
infringement and of its consequences; 
 
 
 
(b)  the cooperation with the European Data Protection Supervisor in order to remedy the infringement and mitigate the possible adverse effects of the infringement, including compliance with any of the measures previously ordered by the European Data Protection Supervisor against the Union institution or agency or body concerned with regard to the same subject matter;
 
 
 
(c)  any similar previous infringements by the 
 
 
Union institution, agency or body; 
 
 
 
2. 
The following infringements shall be 
 
 
subject to administrative fines of up to 500 000 
EUR: 
 
 
 
(a)  non-compliance with the prohibition of 
 
 
the artificial intelligence practices referred to in 
Article 5; 
 
 
 
(b)  non-compliance of the AI system with the   
 
requirements laid down in Article 10. 
 
 
 
3. The non-compliance of the AI system with any requirements or obligations under this Regulation, other than those laid down in Articles 5 and 10, shall be subject to administrative fines of up to 250 000 EUR.
 
 
 
4. 
Before taking decisions pursuant to this 
 
 
Article, the European Data Protection 
Supervisor shall give the Union institution, 
agency or body which is the subject of the 
proceedings conducted by the European Data 
Protection Supervisor the opportunity of being 
heard on the matter regarding the possible 
infringement. The European Data Protection 
Supervisor shall base his or her decisions only 
on elements and circumstances on which the 
parties concerned have been able to comment. 
Complainants, if any, shall be associated closely 
with the proceedings. 
 
 
 
5. The rights of defense of the parties concerned shall be fully respected in the proceedings. They shall be entitled to have access to the European Data Protection Supervisor's file, subject to the legitimate interest of individuals or undertakings in the protection of their personal data or business secrets.
 
 
 
6. 
Funds collected by imposition of fines in 
 
 
this Article shall be the income of the general 
budget of the Union. 
 
 
 
TITLE XI 
 
 
 
 
 
DELEGATION OF POWER AND 
 
 
COMMITTEE PROCEDURE  
 
 
 
Article 73 
 
 
Exercise of the delegation 
 
 
 
1. 
The power to adopt delegated acts is 
 
 
conferred on the Commission subject to the 
conditions laid down in this Article. 

 
 
 
2. The delegation of power referred to in Article 4, Article 7(1), Article 11(3), Article 43(5) and (6) and Article 48(5) shall be conferred on the Commission for a period of five years from [entering into force of the Regulation].

Drafting suggestion: 2. The delegation of power referred to in Article 4, Article 7(1), Article 11(3), Article 43(5) and (6) and Article 48(5) shall be conferred on the Commission for a period of five years from [entering into force of the Regulation].

Comment: The Netherlands is of the opinion that articles 4, 7(1) and 43(6) should contain references to implementing acts rather than delegated acts, as the nature of Annexes I and III and the decision about the conformity assessment procedures are of essential nature and require involvement of the co-legislators.
 
 
 
The Commission shall draw up a report in 
 
 
respect of the delegation of power not later 
than nine months before the end of the 5 year 
period. The delegation of power shall be 
tacitly extended for periods of an identical 
duration, unless the European Parliament or 
the Council opposes such extension not later 
than three months before the end of each 
period. 
 
 
 
3. The delegation of power referred to in Article 4, Article 7(1), Article 11(3), Article 43(5) and (6) and Article 48(5) may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.

Drafting suggestion: 3. The delegation of power referred to in Article 4, Article 7(1), Article 11(3), Article 43(5) and (6) and Article 48(5) may be revoked at any time by the European Parliament or by the Council.

Comment: See 73.1.
 
 
 
4. 
As soon as it adopts a delegated act, the 
 
 
Commission shall notify it simultaneously to the 
European Parliament and to the Council. 
 
 
 
5. Any delegated act adopted pursuant to Article 4, Article 7(1), Article 11(3), Article 43(5) and (6) and Article 48(5) shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of three months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.

Drafting suggestion: Any delegated act adopted pursuant to Article 4, Article 7(1), Article 11(3), Article 43(5) and (6) and Article 48(5) shall […]

Comment: See 73.1.
 
 
 
Article 74 
 
 
Committee procedure 
 
 
 
1. 
The Commission shall be assisted by a 
 
 
committee. That committee shall be a 
committee within the meaning of Regulation 
(EU) No 182/2011. 
 
 
 
2. 
Where reference is made to this 
 
 
paragraph, Article 5 of Regulation (EU) No 
182/2011 shall apply. 
 
 
 
TITLE XII 
 
 

 
 
 
FINAL PROVISIONS  
 
 
 
 
 
Article 75 
 
 
Amendment to Regulation (EC) No 300/2008 
 
 
 
In Article 4(3) of Regulation (EC) No 300/2008,   
 
the following subparagraph is added: 
 
 
 
“When adopting detailed measures related to 
 
 
technical specifications and procedures for 
approval and use of security equipment 
concerning Artificial Intelligence systems in the 
meaning of Regulation (EU) YYY/XX [on 
Artificial Intelligence] of the European 
Parliament and of the Council*, the 
requirements set out in Chapter 2, Title III of 
that Regulation shall be taken into account.” 
 
 
 
__________ 
 
 

 
 
 
* Regulation (EU) YYY/XX [on Artificial 
 
 
Intelligence] (OJ …).” 
 
 
 
Article 76 
 
 
Amendment to Regulation (EU) No 167/2013 
 
 
 
In Article 17(5) of Regulation (EU) No 
 
 
167/2013, the following subparagraph is added: 
 
 
 
“When adopting delegated acts pursuant to the 
 
 
first subparagraph concerning artificial 
intelligence systems which are safety 
components in the meaning of Regulation (EU) 
YYY/XX [on Artificial Intelligence] of the 
European Parliament and of the Council*, the 
requirements set out in Title III, Chapter 2 of 
that Regulation shall be taken into account. 
 
 
 
__________ 
 
 
 
 
 

* Regulation (EU) YYY/XX [on Artificial 
 
 
Intelligence] (OJ …).” 
 
 
 
Article 77 
 
 
Amendment to Regulation (EU) No 168/2013 
 
 
 
In Article 22(5) of Regulation (EU) No 
 
 
168/2013, the following subparagraph is added: 
 
 
 
“When adopting delegated acts pursuant to the 
 
 
first subparagraph concerning Artificial 
Intelligence systems which are safety 
components in the meaning of Regulation (EU) 
YYY/XX on [Artificial Intelligence] of the 
European Parliament and of the Council*, the 
requirements set out in Title III, Chapter 2 of 
that Regulation shall be taken into account. 
 
 
 
__________ 
 
 
 
 
 

* Regulation (EU) YYY/XX [on Artificial 
 
 
Intelligence] (OJ …).” 
 
 
 
Article 78 
 
 
Amendment to Directive 2014/90/EU 
 
 
 
In Article 8 of Directive 2014/90/EU, the 
 
 
following paragraph is added: 
 
 
 
“4. For Artificial Intelligence systems which are   
 
safety components in the meaning of Regulation 
(EU) YYY/XX [on Artificial Intelligence] of the 
European Parliament and of the Council*, when 
carrying out its activities pursuant to paragraph 
1 and when adopting technical specifications 
and testing standards in accordance with 
paragraphs 2 and 3, the Commission shall take 
into account the requirements set out in Title III, 
Chapter 2 of that Regulation. 
 
 
 
__________ 
 
 

 
 
 
* Regulation (EU) YYY/XX [on Artificial 
 
 
Intelligence] (OJ …).”. 
 
 
 
Article 79 
 
 
Amendment to Directive (EU) 2016/797 
 
 
 
In Article 5 of Directive (EU) 2016/797, the 
 
 
following paragraph is added: 
 
 
 
“12. When adopting delegated acts pursuant to 
 
 
paragraph 1 and implementing acts pursuant to 
paragraph 11 concerning Artificial Intelligence 
systems which are safety components in the 
meaning of Regulation (EU) YYY/XX [on 
Artificial Intelligence] of the European 
Parliament and of the Council*, the 
requirements set out in Title III, Chapter 2 of 
that Regulation shall be taken into account. 
 
 
 
__________ 
 
 

 
 
 
* Regulation (EU) YYY/XX [on Artificial 
 
 
Intelligence] (OJ …).”. 
 
 
 
Article 80 
 
 
Amendment to Regulation (EU) 2018/858 
 
 
 
In Article 5 of Regulation (EU) 2018/858 the 
 
 
following paragraph is added: 
 
 
 
“4. When adopting delegated acts pursuant to 
 
 
paragraph 3 concerning Artificial Intelligence 
systems which are safety components in the 
meaning of Regulation (EU) YYY/XX [on 
Artificial Intelligence] of the European 
Parliament and of the Council *, the 
requirements set out in Title III, Chapter 2 of 
that Regulation shall be taken into account. 
 
 
 
__________ 
 
 
 
 
 

* Regulation (EU) YYY/XX [on Artificial 
 
 
Intelligence] (OJ …).”. 
 
 
 
Article 81 
 
 
Amendment to Regulation (EU) 2018/1139 
 
 
 
Regulation (EU) 2018/1139 is amended as 
 
 
follows: 
 
 
 
(1) In Article 17, the following paragraph is 
 
 
added: 
 
 
 
“3. Without prejudice to paragraph 2, when 
 
 
adopting implementing acts pursuant to 
paragraph 1 concerning Artificial Intelligence 
systems which are safety components in the 
meaning of Regulation (EU) YYY/XX [on 
Artificial Intelligence] of the European 
Parliament and of the Council*, the 
requirements set out in Title III, Chapter 2 of 
that Regulation shall be taken into account. 

 
 
 
__________ 
 
 
 
 
 
* Regulation (EU) YYY/XX [on Artificial 
 
 
Intelligence] (OJ …).” 
 
 
 
(2) In Article 19, the following paragraph is 
 
 
added: 
 
 
 
“4. When adopting delegated acts pursuant to 
 
 
paragraphs 1 and 2 concerning Artificial 
Intelligence systems which are safety 
components in the meaning of Regulation (EU) 
YYY/XX [on Artificial Intelligence], the 
requirements set out in Title III, Chapter 2 of 
that Regulation shall be taken into account.” 
 
 
 
(3) In Article 43, the following paragraph is 
 
 
added: 
 
 
 

“4. When adopting implementing acts pursuant 
 
 
to paragraph 1 concerning Artificial Intelligence 
systems which are safety components in the 
meaning of Regulation (EU) YYY/XX [on 
Artificial Intelligence], the requirements set out 
in Title III, Chapter 2 of that Regulation shall be 
taken into account.” 
 
 
 
(4) In Article 47, the following paragraph is 
 
 
added: 
 
 
 
“3. When adopting delegated acts pursuant to 
 
 
paragraphs 1 and 2 concerning Artificial 
Intelligence systems which are safety 
components in the meaning of Regulation (EU) 
YYY/XX [on Artificial Intelligence], the 
requirements set out in Title III, Chapter 2 of 
that Regulation shall be taken into account.” 
 
 
 
(5) In Article 57, the following paragraph is 
 
 
added: 

 
 
 
“When adopting those implementing acts 
 
 
concerning Artificial Intelligence systems which 
are safety components in the meaning of 
Regulation (EU) YYY/XX [on Artificial 
Intelligence], the requirements set out in Title 
III, Chapter 2 of that Regulation shall be taken 
into account.” 
 
 
 
(6) In Article 58, the following paragraph is 
 
 
added: 
 
 
 
“3. When adopting delegated acts pursuant to 
 
 
paragraphs 1 and 2 concerning Artificial 
Intelligence systems which are safety 
components in the meaning of Regulation (EU) 
YYY/XX [on Artificial Intelligence] , the 
requirements set out in Title III, Chapter 2 of 
that Regulation shall be taken into account.”. 
 
 
 

Article 82 
 
 
Amendment to Regulation (EU) 2019/2144 
 
 
 
In Article 11 of Regulation (EU) 2019/2144, the   
 
following paragraph is added: 
 
 
 
“3. When adopting the implementing acts 
 
 
pursuant to paragraph 2, concerning artificial 
intelligence systems which are safety 
components in the meaning of Regulation (EU) 
YYY/XX [on Artificial Intelligence] of the 
European Parliament and of the Council*, the 
requirements set out in Title III, Chapter 2 of 
that Regulation shall be taken into account. 
 
 
 
__________ 
 
 
 
 
 
* Regulation (EU) YYY/XX [on Artificial 
 
 
Intelligence] (OJ …).”. 
 
 
 

Article 83 
 
 
AI systems already placed on the market or put 
into service 
 
 
 
1. 
This Regulation shall not apply to the AI 
 
 
systems which are components of the large-
scale IT systems established by the legal acts 
listed in Annex IX that have been placed on the 
market or put into service before [12 months 
after the date of application of this Regulation 
referred to in Article 85(2)], unless the 
replacement or amendment of those legal acts 
leads to a significant change in the design or 
intended purpose of the AI system or AI 
systems concerned. 
 
 
 
The requirements laid down in this Regulation 
 
 
shall be taken into account, where applicable, in 
the evaluation of each large-scale IT systems 
established by the legal acts listed in Annex IX 

to be undertaken as provided for in those 
respective acts. 
 
 
 
2. 
This Regulation shall apply to the high-
 
 
risk AI systems, other than the ones referred to 
in paragraph 1, that have been placed on the 
market or put into service before [date of 
application of this Regulation referred to in 
Article 85(2)], only if, from that date, those 
systems are subject to significant changes in 
their design or intended purpose. 
 
 
 
Article 84 
 
 
Evaluation and review 
 
 
 
1. 
The Commission shall assess the need for   
 
amendment of the list in Annex III once a year 
following the entry into force of this Regulation. 
 
 
 
1a.  The Commission shall assess the need 
 
 
for amendment of the list in Annex I every 24 

months following the entry into force of this 
Regulation and until the end of the period of 
the delegation of power. The findings of that 
assessment shall be presented to the 
European Parliament and the Council. 
 
 
 
1b.  The Commission shall assess the need 
 
 
for amendment of the list in Annex III every 
24 months following the entry into force of 
this Regulation and until the end of the 
period of the delegation of power. The 
findings of that assessment shall be presented 
to the European Parliament and the Council.
 
 
 
Drafting Suggestions: 1c. The Commission will establish a multistakeholder expert group to assist with the evaluation of Annexes I and III. 

Comments: The amendments of Annexes I and III need consultation of all relevant stakeholders to ensure that the Regulation reflects the latest insights by a wide range of actors and remains effective. The High Level Expert Group could serve as an example. 

2. 
By [three years after the date of 
 
 
application of this Regulation referred to in 
Article 85(2)] and every four years thereafter, 
the Commission shall submit a report on the 
evaluation and review of this Regulation to the 
European Parliament and to the Council. The 
reports shall be made public.   
 
 
 
3. 
The reports referred to in paragraph 2 
 
 
shall devote specific attention to the following: 
 
 
 
(a)  the status of the financial and human 
 
 
resources of the national competent authorities 
in order to effectively perform the tasks 
assigned to them under this Regulation; 
 
 
 
(b)  the state of penalties, and notably 
 
 
administrative fines as referred to in Article 
71(1), applied by Member States to 
infringements of the provisions of this 
Regulation. 

 
 
 
4. 
Within [three years after the date of 
 
 
application of this Regulation referred to in 
Article 85(2)] and every four years thereafter, 
the Commission shall evaluate the impact and 
effectiveness of codes of conduct to foster the 
application of the requirements set out in Title 
III, Chapter 2 and possibly other additional 
requirements for AI systems other than high-risk 
AI systems. 
 
 
 
5. 
For the purpose of paragraphs 1 to 4 the 
 
 
Board, the Member States and national 
competent authorities shall provide the 
Commission with information on its request. 
 
 
 
6. 
In carrying out the evaluations and 
 
 
reviews referred to in paragraphs 1 to 4 the 
Commission shall take into account the 
positions and findings of the Board, of the 

European Parliament, of the Council, and of 
other relevant bodies or sources. 
 
 
 
7. 
The Commission shall, if necessary, 
 
 
submit appropriate proposals to amend this 
Regulation, in particular taking into account 
developments in technology and in the light of 
the state of progress in the information society. 
 
 
 
Article 85 
 
 
Entry into force and application 
 
 
 
1. 
This Regulation shall enter into force on 
 
 
the twentieth day following that of its 
publication in the Official Journal of the 
European Union. 
 
 
 
2. 
This Regulation shall apply from [24 
 
 
months following the entering into force of the 
Regulation]. 
 
 
 

3. 
By way of derogation from  paragraph 2: 
 
 
 
 
 
(a)  Title III, Chapter 4  and Title VI  shall 
 
 
apply from [three months following the entry 
into force of this Regulation]; 
 
 
 
(b)  Article 71 shall apply from [twelve 
 
 
months following the entry into force of this 
Regulation]. 
 
 
 
This Regulation shall be binding in its entirety 
 
 
and directly applicable in all Member States. 
 
 
 
Done at Brussels, 
 
 
 
 
 
For the European Parliament 
For the 
 
 
Council 
 
 
 
The President 
The President 
 
 
 
 
 

ANNEX IV 
 
 
TECHNICAL DOCUMENTATION referred 
to in Article 11(1) 
 
 
 
The technical documentation referred to in 
 
 
Article 11(1) shall contain at least the following 
information, as applicable to the relevant AI 
system: 
 
 
 
1. 
A general description of the AI system 
 
 
including: 
 
 
 
(a)  its intended purpose, the person/s 
 
 
developing the system, the date and the version 
of the system; 
 
 
 
(b)  how the AI system interacts or can be 
 
 
used to interact with hardware or software that 
is not part of the AI system itself, where 
applicable; 
 
 
 

(c)  the versions of relevant software or 
 
 
firmware and any requirement related to version 
update; 
 
 
 
(d)  the description of all forms in which the 
 
 
AI system is placed on the market or put into 
service; 
 
 
 
(e)  the description of hardware on which the 
 
 
AI system is intended to run; 
 
 
 
(f) 
where the AI system is a component of 
 
 
products, photographs or illustrations showing 
external features, marking and internal layout of 
those products; 
 
 
 
(g)  instructions of use for the user and, where   
 
applicable, installation instructions; 
 
 
 

2. 
A detailed description of the elements of 
 
 
the AI system and of the process for its 
development, including: 
 
 
 
(a)  the methods and steps performed for the 
 
 
development of the AI system, including, where 
relevant, recourse to pre-trained systems or tools 
provided by third parties and how these have 
been used, integrated or modified by the 
provider; 
 
 
 
(b)  the design specifications of the system, 
 
 
namely the general logic of the AI system and 
of the algorithms; the key design choices 
including the rationale and assumptions made, 
also with regard to persons or groups of persons 
on which the system is intended to be used; the 
main classification choices; what the system is 
designed to optimise for and the relevance of the 
different parameters; the decisions about any 
possible trade-off made regarding the technical 

solutions adopted to comply with the 
requirements set out in Title III, Chapter 2; 
 
 
 
(c)  the description of the system architecture 
 
 
explaining how software components build on 
or feed into each other and integrate into the 
overall processing; the computational resources 
used to develop, train, test and validate the AI 
system; 
 
 
 
(d)  where relevant, the data requirements in 
 
 
terms of datasheets describing the training 
methodologies and techniques and the training 
data sets used, including information about the 
provenance of those data sets, their scope and 
main characteristics; how the data was obtained 
and selected; labelling procedures (e.g. for 
supervised learning), data cleaning 
methodologies (e.g. outliers detection); 
 
 
 

(e)  assessment of the human oversight 
 
 
measures needed in accordance with Article 14, 
including an assessment of the technical 
measures needed to facilitate the interpretation 
of the outputs of AI systems by the users, in 
accordance with Article 13(3)(d); 
 
 
 
(f) 
where applicable, a detailed description of   
 
pre-determined changes  to the AI system and 
its performance, together with all the relevant 
information related to the technical solutions 
adopted to ensure continuous compliance of the 
AI system with the relevant requirements set out 
in Title III, Chapter 2; 
 
 
 
(g)  the validation and testing procedures used,   
 
including information about the validation and 
testing data used and their main characteristics; 
metrics used to measure accuracy, robustness, 
cybersecurity and compliance with other 
relevant requirements set out in Title III, 

Chapter 2 as well as potentially discriminatory 
impacts; test logs and all test reports dated and 
signed by the responsible persons, including 
with regard to pre-determined changes as 
referred to under point (f). 
 
 
 
3. 
Detailed information about the 
 
 
monitoring, functioning and control of the AI 
system, in particular with regard to: its 
capabilities and limitations in performance, 
including the degrees of accuracy for specific 
persons or groups of persons on which the 
system is intended to be used and the overall 
expected level of accuracy in relation to its 
intended purpose; the foreseeable unintended 
outcomes and sources of risks to health and 
safety, fundamental rights and discrimination in 
view of the intended purpose of the AI system; 
the human oversight measures needed in 
accordance with Article 14, including the 
technical measures put in place to facilitate the 

interpretation of the outputs of AI systems by 
the users; specifications on input data, as 
appropriate; 
 
 
 
4. 
A detailed description of the risk 
 
 
management system in accordance with Article 
9; 
 
 
 
5. 
A description of any change made to the 
 
 
system through its lifecycle; 
 
 
 
6. 
A list of the harmonised standards applied   
 
in full or in part the references of which have 
been published in the Official Journal of the 
European Union; where no such harmonised 
standards have been applied, a detailed 
description of the solutions adopted to meet the 
requirements set out in Title III, Chapter 2, 
including a list of other relevant standards and 
technical specifications applied; 
 
 
 

7. 
A copy of the EU declaration of 
 
 
conformity; 
 
 
 
8. 
A detailed description of the system in 
 
 
place to evaluate the AI system performance in 
the post-market phase in accordance with 
Article 61, including the post-market monitoring 
plan referred to in Article 61(3). 
 
 
 
ANNEX V 
 
 
EU DECLARATION OF CONFORMITY 
 
 
 
The EU declaration of conformity referred to in   
 
Article 48, shall contain all of the following 
information: 
 
 
 
1. 
AI system name and type and any 
 
 
additional unambiguous reference allowing 
identification and traceability of the AI system; 
 
 
 

2. 
Name and address of the provider or, 
 
 
where applicable, their authorised 
representative; 
 
 
 
3. 
A statement that the EU declaration of 
 
 
conformity is issued under the sole 
responsibility of the provider; 
 
 
 
4. 
A statement that the AI system in question   
 
is in conformity with this Regulation and, if 
applicable, with any other relevant Union 
legislation that provides for the issuing of an EU 
declaration of conformity; 
 
 
 
5. 
References to any relevant harmonised 
 
 
standards used or any other common 
specification in relation to which conformity is 
declared; 
 
 
 
6. 
Where applicable, the name and 
 
 
identification number of the notified body, a 

description of the conformity assessment 
procedure performed and identification of the 
certificate issued; 
 
 
 
7. 
Place and date of issue of the declaration, 
 
 
name and function of the person who signed it 
as well as an indication for, and on behalf of 
whom, that person signed, signature. 
 
 
 
ANNEX VI 
 
 
CONFORMITY ASSESSMENT 
PROCEDURE BASED ON INTERNAL 
CONTROL 
 
 
 
1. 
The conformity assessment procedure 
 
 
based on internal control is the conformity 
assessment procedure based on points 2 to 4. 
 
 
 
2. 
The provider verifies that the established 
 
 
quality management system is in compliance 
with the requirements of Article 17.  

 
 
 
3. 
The provider examines the information 
 
 
contained in the technical documentation in 
order to assess the compliance of the AI system 
with the relevant essential requirements set out 
in Title III, Chapter 2. 
 
 
 
4. 
The provider also verifies that the design 
 
 
and development process of the AI system and 
its post-market monitoring as referred to in 
Article 61 is consistent with the technical 
documentation. 
 
 
 
ANNEX VII 
 
 
CONFORMITY BASED ON ASSESSMENT 
OF QUALITY MANAGEMENT SYSTEM 
AND ASSESSMENT OF TECHNICAL 
DOCUMENTATION 
 
 
 
1. 
Introduction 
 
 
 
 
 

Conformity based on assessment of quality 
 
 
management system and assessment of the 
technical documentation is the conformity 
assessment procedure based on points 2 to 5.  
 
 
 
2. 
Overview 
 
 
 
 
 
The approved quality management system for 
 
 
the design, development and testing of AI 
systems pursuant to Article 17 shall be 
examined in accordance with point 3 and shall 
be subject to surveillance as specified in point 5. 
The technical documentation of the AI system 
shall be examined in accordance with point 4. 
 
 
 
3. 
Quality management system 
 
 
 
 
 
3.1.  The application of the provider shall 
 
 
include: 
 
 
 

(a)  the name and address of the provider and,   
 
if the application is lodged by the authorised 
representative, their name and address as well; 
 
 
 
(b)  the list of AI systems covered under the 
 
 
same quality management system; 
 
 
 
(c)  the technical documentation for each AI 
 
 
system covered under the same quality 
management system; 
 
 
 
(d)  the documentation concerning the quality 
 
 
management system which shall cover all the 
aspects listed under Article 17; 
 
 
 
(e)  a description of the procedures in place to   
 
ensure that the quality management system 
remains adequate and effective; 
 
 
 

(f) 
a written declaration that the same 
 
 
application has not been lodged with any other 
notified body. 
 
 
 
3.2.  The quality management system shall be 
 
 
assessed by the notified body, which shall 
determine whether it satisfies the requirements 
referred to in Article 17. 
 
 
 
The decision shall be notified to the provider or 
 
 
its authorised representative. 
 
 
 
The notification shall contain the conclusions of   
 
the assessment of the quality management 
system and the reasoned assessment decision. 
 
 
 
3.3.  The quality management system as 
 
 
approved shall continue to be implemented and 
maintained by the provider so that it remains 
adequate and efficient. 
 
 
 

3.4.  Any intended change to the approved 
 
 
quality management system or the list of AI 
systems covered by the latter shall be brought to 
the attention of the notified body by the 
provider. 
 
 
 
The proposed changes shall be examined by the   
 
notified body, which shall decide whether the 
modified quality management system continues 
to satisfy the requirements referred to in point 
3.2 or whether a reassessment is necessary. 
 
 
 
The notified body shall notify the provider of its   
 
decision. The notification shall contain the 
conclusions of the examination of the changes 
and the reasoned assessment decision. 
 
 
 
4. 
Control of the technical documentation. 
 
 
 
 
 
4.1.  In addition to the application referred to in   
 
point 3, an application with a notified body of 

their choice shall be lodged by the provider for 
the assessment of the technical documentation 
relating to the AI system which the provider 
intends to place on the market or put into 
service and which is covered by the quality 
management system referred to under point 3. 
 
 
 
4.2.  The application shall include: 
 
 
 
 
 
(a)  the name and address of the provider; 
 
 
 
 
 
(b)  a written declaration that the same 
 
 
application has not been lodged with any other 
notified body; 
 
 
 
(c)  the technical documentation referred to in   
 
Annex IV. 
 
 
 
4.3.  The technical documentation shall be 
 
 
examined by the notified body. To this purpose, 
the notified body shall be granted full access to 

the training and testing datasets used by the 
provider, including through application 
programming interfaces (API) or other 
appropriate means and tools enabling remote 
access. 
 
 
 
4.4.  In examining the technical documentation,   
 
the notified body may require that the provider 
supplies further evidence or carries out further 
tests so as to enable a proper assessment of 
conformity of the AI system with the 
requirements set out in Title III, Chapter 2. 
Whenever the notified body is not satisfied with 
the tests carried out by the provider, the notified 
body shall directly carry out adequate tests, as 
appropriate.  
 
 
 
4.5.  Where necessary to assess the conformity 
 
 
of the high-risk AI system with the requirements 
set out in Title III, Chapter 2 and upon a 
reasoned request, the notified body shall also be 

granted access to the source code of the AI 
system. 
 
 
 
4.6.  The decision shall be notified to the 
 
 
provider or its authorised representative. The 
notification shall contain the conclusions of the 
assessment of the technical documentation and 
the reasoned assessment decision. 
 
 
 
Where the AI system is in conformity with the 
 
 
requirements set out in Title III, Chapter 2, an 
EU technical documentation assessment 
certificate shall be issued by the notified body. 
The certificate shall indicate the name and 
address of the provider, the conclusions of the 
examination, the conditions (if any) for its 
validity and the data necessary for the 
identification of the AI system. 
 
 
 
The certificate and its annexes shall contain all 
 
 
relevant information to allow the conformity of 

the AI system to be evaluated, and to allow for 
control of the AI system while in use, where 
applicable. 
 
 
 
Where the AI system is not in conformity with 
 
 
the requirements set out in Title III, Chapter 2, 
the notified body shall refuse to issue an EU 
technical documentation assessment certificate 
and shall inform the applicant accordingly, 
giving detailed reasons for its refusal. 
 
 
 
Where the AI system does not meet the 
 
 
requirement relating to the data used to train it, 
re-training of the AI system will be needed prior 
to the application for a new conformity 
assessment. In this case, the reasoned 
assessment decision of the notified body 
refusing to issue the EU technical 
documentation assessment certificate shall 
contain specific considerations on the quality 

data used to train the AI system, notably on the 
reasons for non-compliance. 
 
 
 
4.7.  Any change to the AI system that could 
 
 
affect the compliance of the AI system with the 
requirements or its intended purpose shall be 
approved by the notified body which issued the 
EU technical documentation assessment 
certificate. The provider shall inform such 
notified body of its intention to introduce any of 
the above-mentioned changes or if it becomes 
otherwise aware of the occurrence of such 
changes. The intended changes shall be assessed 
by the notified body which shall decide whether 
those changes require a new conformity 
assessment in accordance with Article 43(4) or 
whether they could be addressed by means of a 
supplement to the EU technical documentation 
assessment certificate. In the latter case, the 
notified body shall assess the changes, notify the 
provider of its decision and, where the changes 

Presidency compromise text 
Drafting Suggestions 
Comments 
are approved, issue to the provider a supplement 
to the EU technical documentation assessment 
certificate. 
 
 
 
5. 
Surveillance of the approved quality 
 
 
management system. 
 
 
 
5.1.  The purpose of the surveillance carried 
 
 
out by the notified body referred to in Point 3 is 
to make sure that the provider duly fulfils the 
terms and conditions of the approved quality 
management system. 
 
 
 
5.2.  For assessment purposes, the provider 
 
 
shall allow the notified body to access the 
premises where the design, development, testing 
of the AI systems is taking place. The provider 
shall further share with the notified body all 
necessary information. 
 
 
 

5.3.  The notified body shall carry out periodic 
 
 
audits to make sure that the provider maintains 
and applies the quality management system and 
shall provide the provider with an audit report. 
In the context of those audits, the notified body 
may carry out additional tests of the AI systems 
for which an EU technical documentation 
assessment certificate was issued. 
 
 
 
ANNEX VIII 
 
 
INFORMATION TO BE SUBMITTED 
UPON THE REGISTRATION OF HIGH-
RISK AI SYSTEMS IN ACCORDANCE 
WITH ARTICLE 51 
 
 
 
The following information shall be provided and   
 
thereafter kept up to date with regard to high-
risk AI systems to be registered in accordance 
with Article 51. 
 
 
 

Presidency compromise text: 1. Name, address and contact details of the provider; 

Drafting Suggestions: 1. Title position Name, address and contact details of the provider; 

Comments: People often change jobs, and the name used in the database may not be up to date anymore. Referring to the position/job of someone is more resistant to changes. 
 
 
 
Presidency compromise text: 2. Where submission of information is carried out by another person on behalf of the provider, the name, address and contact details of that person; 

Drafting Suggestions: 2. Where submission of information is carried out by another person on behalf of the provider, the title position name, address and contact details of that person; 

Comments: People often change jobs, and the name used in the database may not be up to date anymore. Referring to the position/job of someone is more resistant to changes. 
 
 
 
3. Name, address and contact details of the authorised representative, where applicable;
Drafting suggestion: 3. Title position Name, address and contact details of the authorised representative, where applicable;
Comment: People often change jobs, and the name used in the database may not be up to date anymore. Referring to the position/job of someone is more resistant to changes.

Drafting suggestion (new provision): 3a. Title position Name, address and contact details of the user, where applicable;
Comment: Besides the provider, it should also be clear which organisations are using these systems.

4. AI system trade name and any additional unambiguous reference allowing identification and traceability of the AI system;


5. Description of the intended purpose of the AI system;
Drafting suggestion: Description of the intended purpose of the AI system, the context and actual purpose of deployment (if different from intended purpose), and the designation of impacted persons;
Comment: Intended purpose could differ from actual use.

6. Status of the AI system (on the market, or in service; no longer placed on the market/in service, recalled);

7. Type, number and expiry date of the certificate issued by the notified body and the name or identification number of that notified body, when applicable;

8. A scanned copy of the certificate referred to in point 7, when applicable;

9. Member States in which the AI system is or has been placed on the market, put into service or made available in the Union;


10. A copy of the EU declaration of conformity referred to in Article 48;

11. Electronic instructions for use; this information shall not be provided for high-risk AI systems in the areas of law enforcement and migration, asylum and border control management referred to in Annex III, points 1, 6 and 7.
Comment: How does the exemption ‘this information shall not be provided for high-risk AI systems in the areas of law enforcement (…)’ relate to the exception regarding tax and customs authorities, laid down in preamble no. 38?

12. URL for additional information (optional).

 
ANNEX IX

UNION LEGISLATION ON LARGE-SCALE IT SYSTEMS IN THE AREA OF FREEDOM, SECURITY AND JUSTICE

1. Schengen Information System

(a) Regulation (EU) 2018/1860 of the European Parliament and of the Council of 28 November 2018 on the use of the Schengen Information System for the return of illegally staying third-country nationals (OJ L 312, 7.12.2018, p. 1).

(b) Regulation (EU) 2018/1861 of the European Parliament and of the Council of 28 November 2018 on the establishment, operation and use of the Schengen Information System (SIS) in the field of border checks, and amending the Convention implementing the Schengen Agreement, and amending and repealing Regulation (EC) No 1987/2006 (OJ L 312, 7.12.2018, p. 14).

(c) Regulation (EU) 2018/1862 of the European Parliament and of the Council of 28 November 2018 on the establishment, operation and use of the Schengen Information System (SIS) in the field of police cooperation and judicial cooperation in criminal matters, amending and repealing Council Decision 2007/533/JHA, and repealing Regulation (EC) No 1986/2006 of the European Parliament and of the Council and Commission Decision 2010/261/EU (OJ L 312, 7.12.2018, p. 56).

2. Visa Information System

(a) Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL amending Regulation (EC) No 767/2008, Regulation (EC) No 810/2009, Regulation (EU) 2017/2226, Regulation (EU) 2016/399, Regulation XX/2018 [Interoperability Regulation], and Decision 2004/512/EC and repealing Council Decision 2008/633/JHA - COM(2018) 302 final. To be updated once the Regulation is adopted (April/May 2021) by the co-legislators.

3. Eurodac

(a) Amended proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on the establishment of 'Eurodac' for the comparison of biometric data for the effective application of Regulation (EU) XXX/XXX [Regulation on Asylum and Migration Management] and of Regulation (EU) XXX/XXX [Resettlement Regulation], for identifying an illegally staying third-country national or stateless person and on requests for the comparison with Eurodac data by Member States' law enforcement authorities and Europol for law enforcement purposes and amending Regulations (EU) 2018/1240 and (EU) 2019/818 – COM(2020) 614 final.

4. Entry/Exit System

(a) Regulation (EU) 2017/2226 of the European Parliament and of the Council of 30 November 2017 establishing an Entry/Exit System (EES) to register entry and exit data and refusal of entry data of third-country nationals crossing the external borders of the Member States and determining the conditions for access to the EES for law enforcement purposes, and amending the Convention implementing the Schengen Agreement and Regulations (EC) No 767/2008 and (EU) No 1077/2011 (OJ L 327, 9.12.2017, p. 20).

5. European Travel Information and Authorisation System

(a) Regulation (EU) 2018/1240 of the European Parliament and of the Council of 12 September 2018 establishing a European Travel Information and Authorisation System (ETIAS) and amending Regulations (EU) No 1077/2011, (EU) No 515/2014, (EU) 2016/399, (EU) 2016/1624 and (EU) 2017/2226 (OJ L 236, 19.9.2018, p. 1).

(b) Regulation (EU) 2018/1241 of the European Parliament and of the Council of 12 September 2018 amending Regulation (EU) 2016/794 for the purpose of establishing a European Travel Information and Authorisation System (ETIAS) (OJ L 236, 19.9.2018, p. 72).

6. European Criminal Records Information System on third-country nationals and stateless persons

(a) Regulation (EU) 2019/816 of the European Parliament and of the Council of 17 April 2019 establishing a centralised system for the identification of Member States holding conviction information on third-country nationals and stateless persons (ECRIS-TCN) to supplement the European Criminal Records Information System and amending Regulation (EU) 2018/1726 (OJ L 135, 22.5.2019, p. 1).

7. Interoperability

(a) Regulation (EU) 2019/817 of the European Parliament and of the Council of 20 May 2019 on establishing a framework for interoperability between EU information systems in the field of borders and visa (OJ L 135, 22.5.2019, p. 27).

(b) Regulation (EU) 2019/818 of the European Parliament and of the Council of 20 May 2019 on establishing a framework for interoperability between EU information systems in the field of police and judicial cooperation, asylum and migration (OJ L 135, 22.5.2019, p. 85).

 
End