Brussels, 06 January 2022
WK 130/2022 INIT
LIMITE
TELECOM
WORKING PAPER
This is a paper intended for a specific community of recipients. Handling and
further distribution are under the sole responsibility of community members.
CONTRIBUTION
From:
General Secretariat of the Council
To:
Working Party on Telecommunications and Information Society
Subject:
Artificial Intelligence Act - IE comments (doc. 14278/21)
Delegations will find in annex the IE comments on Artificial Intelligence Act (doc. 14278/21).
Deadline for comments:
6 January 2022
Presidency compromise text for Artificial Intelligence Act (doc. 14278/21)
(Comments and drafting suggestions requested on Articles 30-85, Annexes V-IX)
Important: In order to guarantee that your comments appear accurately, please do not modify the table format by adding/removing/adjusting/merging/splitting cells and rows. This would hinder the
consolidation of your comments. When adding new provisions, please use the free rows provided for this purpose between the provisions. You can add multiple provisions in one row, if necessary, but do not
add or remove rows. For drafting suggestions (2nd column), please copy the relevant sentence or sentences from a given paragraph or point into the second column and add or remove text. Please do not use
track changes, but highlight your additions in yellow or use strikethrough to indicate deletions. You do not need to copy entire paragraphs or points to indicate your changes, copying and modifying the
relevant sentences is sufficient. For comments on specific provisions, please insert your remarks in the 3rd column in the relevant row. If you wish to make general comments on the entire proposal, please do
so in the row containing the title of the proposal (in the 3rd column).
Presidency compromise text
Drafting Suggestions
Comments
Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL LAYING DOWN HARMONISED RULES ON ARTIFICIAL INTELLIGENCE (ARTIFICIAL INTELLIGENCE ACT) AND AMENDING CERTAIN UNION LEGISLATIVE ACTS
(Text with EEA relevance)
Overall comment: “Ireland very much welcomes the compromise text for the AIA and supports a harmonised regulatory environment across the EU in relation to AI technology. It is important that the approach is proportionate but at the same time instils trust in AI Systems and that the individual is protected”.
CHAPTER 4
NOTIFYING AUTHORITIES AND
NOTIFIED BODIES
Article 30
Notifying authorities
1.
Each Member State shall designate or
establish a notifying authority responsible for
setting up and carrying out the necessary
procedures for the assessment, designation and
notification of conformity assessment bodies
and for their monitoring.
2.
Member States may designate a national
accreditation body referred to in Regulation
(EC) No 765/2008 as a notifying authority.
3.
Notifying authorities shall be established,
organised and operated in such a way that no
conflict of interest arises with conformity
assessment bodies and the objectivity and
impartiality of their activities are safeguarded.
4.
Notifying authorities shall be organised in
such a way that decisions relating to the
notification of conformity assessment bodies are
taken by competent persons different from those
who carried out the assessment of those bodies.
5.
Notifying authorities shall not offer or
provide any activities that conformity
assessment bodies perform or any consultancy
services on a commercial or competitive basis.
6.
Notifying authorities shall safeguard the
confidentiality of the information they obtain.
7.
Notifying authorities shall have a
sufficient number of competent personnel at
their disposal for the proper performance of
their tasks.
8.
Notifying authorities shall make sure that
conformity assessments are carried out in a
proportionate manner, avoiding unnecessary
burdens for providers and that notified bodies
perform their activities taking due account of
the size of an undertaking, the sector in which it
operates, its structure and the degree of
complexity of the AI system in question.
Article 31
Application of a conformity assessment body for
notification
1.
Conformity assessment bodies shall
submit an application for notification to the
notifying authority of the Member State in
which they are established.
2.
The application for notification shall be
accompanied by a description of the conformity
assessment activities, the conformity assessment
module or modules and the artificial intelligence
technologies for which the conformity
assessment body claims to be competent, as well
as by an accreditation certificate, where one
exists, issued by a national accreditation body
attesting that the conformity assessment body
fulfils the requirements laid down in Article 33.
Any valid document related to existing
designations of the applicant notified body
under any other Union harmonisation legislation
shall be added.
3.
Where the conformity assessment body
concerned cannot provide an accreditation
certificate, it shall provide the notifying
authority with the documentary evidence
necessary for the verification, recognition and
regular monitoring of its compliance with the
requirements laid down in Article 33. For
notified bodies which are designated under any
other Union harmonisation legislation, all
documents and certificates linked to those
designations may be used to support their
designation procedure under this Regulation, as
appropriate.
Article 32
Notification procedure
1.
Notifying authorities may notify only
conformity assessment bodies which have
satisfied the requirements laid down in Article
33.
2.
Notifying authorities shall notify the
Commission and the other Member States using
the electronic notification tool developed and
managed by the Commission.
3.
The notification shall include full details
of the conformity assessment activities, the
conformity assessment module or modules and
the artificial intelligence technologies
concerned.
4.
The conformity assessment body
concerned may perform the activities of a
notified body only where no objections are
raised by the Commission or the other Member
States within one month of a notification.
5.
Notifying authorities shall notify the
Commission and the other Member States of
any subsequent relevant changes to the
notification.
Article 33
Notified bodies
1.
Notified bodies shall verify the conformity
of high-risk AI systems in accordance with the
conformity assessment procedures referred to in
Article 43.
2.
Notified bodies shall satisfy the
organisational, quality management, resources
and process requirements that are necessary to
fulfil their tasks.
3.
The organisational structure, allocation of
responsibilities, reporting lines and operation of
notified bodies shall be such as to ensure that
there is confidence in the performance by and in
the results of the conformity assessment
activities that the notified bodies conduct.
4.
Notified bodies shall be independent of
the provider of a high-risk AI system in relation
to which it performs conformity assessment
activities. Notified bodies shall also be
independent of any other operator having an
economic interest in the high-risk AI system
that is assessed, as well as of any competitors of
the provider.
5.
Notified bodies shall be organised and
operated so as to safeguard the independence,
objectivity and impartiality of their activities.
Notified bodies shall document and implement a
structure and procedures to safeguard
impartiality and to promote and apply the
principles of impartiality throughout their
organisation, personnel and assessment
activities.
6.
Notified bodies shall have documented
procedures in place ensuring that their
personnel, committees, subsidiaries,
subcontractors and any associated body or
personnel of external bodies respect the
confidentiality of the information which comes
into their possession during the performance of
conformity assessment activities, except when
disclosure is required by law. The staff of
notified bodies shall be bound to observe
professional secrecy with regard to all
information obtained in carrying out their tasks
under this Regulation, except in relation to the
notifying authorities of the Member State in
which their activities are carried out.
7.
Notified bodies shall have procedures for
the performance of activities which take due
account of the size of an undertaking, the sector
in which it operates, its structure and the degree
of complexity of the AI system in question.
8.
Notified bodies shall take out appropriate
liability insurance for their conformity
assessment activities, unless liability is assumed
by the Member State concerned in accordance
with national law or that Member State is
directly responsible for the conformity
assessment.
9.
Notified bodies shall be capable of
carrying out all the tasks falling to them under
this Regulation with the highest degree of
professional integrity and the requisite
competence in the specific field, whether those
tasks are carried out by notified bodies
themselves or on their behalf and under their
responsibility.
10. Notified bodies shall have sufficient
internal competences to be able to effectively
evaluate the tasks conducted by external parties
on their behalf. To that end, at all times and for
each conformity assessment procedure and each
type of high-risk AI system in relation to which
they have been designated, the notified body
shall have permanent availability of sufficient
administrative, technical and scientific
personnel who possess experience and
knowledge relating to the relevant artificial
intelligence technologies, data and data
computing and to the requirements set out in
Chapter 2 of this Title.
11. Notified bodies shall participate in
coordination activities as referred to in Article
38. They shall also take part directly or be
represented in European standardisation
organisations, or ensure that they are aware and
up to date in respect of relevant standards.
12. Notified bodies shall make available and
submit upon request all relevant documentation,
including the providers’ documentation, to the
notifying authority referred to in Article 30 to
allow it to conduct its assessment, designation,
notification, monitoring and surveillance
activities and to facilitate the assessment
outlined in this Chapter.
Article 34
Subsidiaries of and subcontracting by notified
bodies
1.
Where a notified body subcontracts
specific tasks connected with the conformity
assessment or has recourse to a subsidiary, it
shall ensure that the subcontractor or the
subsidiary meets the requirements laid down in
Article 33 and shall inform the notifying
authority accordingly.
2.
Notified bodies shall take full
responsibility for the tasks performed by
subcontractors or subsidiaries wherever these
are established.
3.
Activities may be subcontracted or carried
out by a subsidiary only with the agreement of
the provider.
4.
Notified bodies shall keep at the disposal
of the notifying authority the relevant
documents concerning the assessment of the
qualifications of the subcontractor or the
subsidiary and the work carried out by them
under this Regulation.
Article 35
Identification numbers and lists of notified
bodies designated under this Regulation
1.
The Commission shall assign an
identification number to notified bodies. It shall
assign a single number, even where a body is
notified under several Union acts.
2.
The Commission shall make publicly
available the list of the bodies notified under
this Regulation, including the identification
numbers that have been assigned to them and
the activities for which they have been notified.
The Commission shall ensure that the list is kept
up to date.
Article 36
Changes to notifications
1.
Where a notifying authority has suspicions
or has been informed that a notified body no
longer meets the requirements laid down in
Article 33, or that it is failing to fulfil its
obligations, that authority shall without delay
investigate the matter with the utmost diligence.
In that context, it shall inform the notified body
concerned about the objections raised and give
it the possibility to make its views known. If the
notifying authority comes to the conclusion that
the notified body no longer meets
the requirements laid down in Article 33 or that
it is failing to fulfil its obligations, it shall
restrict, suspend or withdraw the notification as
appropriate, depending on the seriousness of the
failure. It shall also immediately inform the
Commission and the other Member States
accordingly.
2.
In the event of restriction, suspension or
withdrawal of notification, or where the notified
body has ceased its activity, the notifying
authority shall take appropriate steps to ensure
that the files of that notified body are either
taken over by another notified body or kept
available for the responsible notifying
authorities at their request.
Article 37
Challenge to the competence of notified bodies
1.
The Commission shall, where necessary,
investigate all cases where there are reasons to
doubt whether a notified body complies with the
requirements laid down in Article 33.
2.
The notifying authority shall provide the
Commission, on request, with all relevant
information relating to the notification of the
notified body concerned.
3.
The Commission shall ensure that all
confidential information obtained in the course
of its investigations pursuant to this Article is
treated confidentially.
4.
Where the Commission ascertains that a
notified body does not meet or no longer meets
the requirements laid down in Article 33, it shall
adopt a reasoned decision requesting the
notifying Member State to take the necessary
corrective measures, including withdrawal of
notification if necessary. That implementing act
shall be adopted in accordance with the
examination procedure referred to in Article
74(2).
Article 38
Coordination of notified bodies
1.
The Commission shall ensure that, with
regard to the areas covered by this Regulation,
appropriate coordination and cooperation
between notified bodies active in the conformity
assessment procedures of AI systems pursuant
to this Regulation are put in place and properly
operated in the form of a sectoral group of
notified bodies.
2.
Member States shall ensure that the bodies
notified by them participate in the work of that
group, directly or by means of designated
representatives.
Article 39
Conformity assessment bodies of third countries
Conformity assessment bodies established under
the law of a third country with which the Union
has concluded an agreement may be authorised
to carry out the activities of notified bodies
under this Regulation.
CHAPTER 5
STANDARDS, CONFORMITY
ASSESSMENT, CERTIFICATES,
REGISTRATION
Article 40
Harmonised standards
Comment: IE notes that Cion intends to devote significant efforts, along with MS, to make maximum use of international standards, and IE welcomes this.
High-risk AI systems which are in conformity
with harmonised standards or parts thereof the
references of which have been published in the
Official Journal of the European Union shall be
presumed to be in conformity with the
requirements set out in Chapter 2 of this Title, to
the extent those standards cover those
requirements.
Article 41
Common specifications
1.
Where harmonised standards referred to in
Article 40 do not exist or where the Commission
considers that the relevant harmonised standards
are insufficient or that there is a need to address
specific safety or fundamental right concerns,
the Commission may, by means of
implementing acts, adopt common
specifications in respect of the requirements set
out in Chapter 2 of this Title. Those
implementing acts shall be adopted in
accordance with the examination procedure
referred to in Article 74(2).
2.
The Commission, when preparing the
common specifications referred to in paragraph
1, shall gather the views of relevant bodies or
expert groups established under relevant
sectorial Union law.
3.
High-risk AI systems which are in
conformity with the common specifications
referred to in paragraph 1 shall be presumed to
be in conformity with the requirements set out
in Chapter 2 of this Title, to the extent those
common specifications cover those
requirements.
4.
Where providers do not comply with the
common specifications referred to in paragraph
1, they shall duly justify that they have adopted
technical solutions that are at least equivalent
thereto.
Article 42
Presumption of conformity with certain
requirements
1.
Taking into account their intended
purpose, high-risk AI systems that have been
trained and tested on data concerning the
specific geographical, behavioural and
functional setting within which they are
intended to be used shall be presumed to be in
compliance with the requirement set out in
Article 10(4).
2.
High-risk AI systems that have been
certified or for which a statement of conformity
has been issued under a cybersecurity scheme
pursuant to Regulation (EU) 2019/881 of the
European Parliament and of the Council1 and
the references of which have been published in
the Official Journal of the European Union shall
be presumed to be in compliance with the
cybersecurity requirements set out in Article 15
1 Regulation (EU) 2019/881 of the European Parliament and of the Council of 17 April 2019 on ENISA (the European Union Agency for Cybersecurity) and on information and communications technology cybersecurity certification and repealing Regulation (EU) No 526/2013 (Cybersecurity Act) (OJ L 151, 7.6.2019, p. 1).
of this Regulation in so far as the cybersecurity
certificate or statement of conformity or parts
thereof cover those requirements.
Article 43
Conformity assessment
1.
For high-risk AI systems listed in point 1
of Annex III, where, in demonstrating the
compliance of a high-risk AI system with the
requirements set out in Chapter 2 of this Title,
the provider has applied harmonised standards
referred to in Article 40, or, where applicable,
common specifications referred to in Article 41,
the provider shall follow one of the following
procedures:
(a) the conformity assessment procedure
based on internal control referred to in Annex
VI;
(b) the conformity assessment procedure
based on assessment of the quality management
system and assessment of the technical
documentation, with the involvement of a
notified body, referred to in Annex VII.
Where, in demonstrating the compliance of a
high-risk AI system with the requirements set
out in Chapter 2 of this Title, the provider has
not applied or has applied only in part
harmonised standards referred to in Article 40,
or where such harmonised standards do not exist
and common specifications referred to in Article
41 are not available, the provider shall follow
the conformity assessment procedure set out in
Annex VII.
For the purpose of the conformity assessment
procedure referred to in Annex VII, the provider
may choose any of the notified bodies.
However, when the system is intended to be put
into service by law enforcement, immigration or
asylum authorities as well as EU institutions,
bodies or agencies, the market surveillance
authority referred to in Article 63(5) or (6), as
applicable, shall act as a notified body.
2.
For high-risk AI systems referred to in
points 2 to 8 of Annex III, providers shall follow
the conformity assessment procedure based on
internal control as referred to in Annex VI,
which does not provide for the involvement of a
notified body. For high-risk AI systems referred
to in point 5(b) of Annex III, placed on the
market or put into service by credit institutions
regulated by Directive 2013/36/EU, the
conformity assessment shall be carried out as
part of the procedure referred to in Articles 97
to 101 of that Directive.
3.
For high-risk AI systems, to which legal
acts listed in Annex II, section A, apply, the
provider shall follow the relevant conformity
assessment as required under those legal acts.
The requirements set out in Chapter 2 of this
Title shall apply to those high-risk AI systems
and shall be part of that assessment. Points 4.3.,
4.4., 4.5. and the fifth paragraph of point 4.6 of
Annex VII shall also apply.
For the purpose of that assessment, notified
bodies which have been notified under those
legal acts shall be entitled to control the
conformity of the high-risk AI systems with the
requirements set out in Chapter 2 of this Title,
provided that the compliance of those notified
bodies with requirements laid down in Article
33(4), (9) and (10) has been assessed in the
context of the notification procedure under those
legal acts.
Where the legal acts listed in Annex II, section
A, enable the manufacturer of the product to opt
out from a third-party conformity assessment,
provided that that manufacturer has applied all
harmonised standards covering all the relevant
requirements, that manufacturer may make use
of that option only if he has also applied
harmonised standards or, where applicable,
common specifications referred to in Article 41,
covering the requirements set out in Chapter 2
of this Title.
4.
High-risk AI systems shall undergo a new
conformity assessment procedure whenever they
are substantially modified, regardless of whether
the modified system is intended to be further
distributed or continues to be used by the
current user.
For high-risk AI systems that continue to learn
after being placed on the market or put into
service, changes to the high-risk AI system and
its performance that have been pre-determined
by the provider at the moment of the initial
conformity assessment and are part of the
information contained in the technical
documentation referred to in point 2(f) of Annex
IV, shall not constitute a substantial
modification.
5.
The Commission is empowered to adopt
delegated acts in accordance with Article 73 for
the purpose of updating Annexes VI and VII
in order to introduce elements of the
conformity assessment procedures that become
necessary in light of technical progress.
6.
The Commission is empowered to adopt
delegated acts to amend paragraphs 1 and 2 in
order to subject high-risk AI systems referred to
in points 2 to 8 of Annex III to the conformity
assessment procedure referred to in Annex VII
or parts thereof. The Commission shall adopt
such delegated acts taking into account the
effectiveness of the conformity assessment
procedure based on internal control referred to
in Annex VI in preventing or minimising the
risks to health and safety and protection of
fundamental rights posed by such systems as
well as the availability of adequate capacities
and resources among notified bodies.
Article 44
Certificates
1.
Certificates issued by notified bodies in
accordance with Annex VII shall be drawn-up
in an official Union language determined by the
Member State in which the notified body is
established or in an official Union language
otherwise acceptable to the notified body.
2.
Certificates shall be valid for the period
they indicate, which shall not exceed five years.
On application by the provider, the validity of a
certificate may be extended for further periods,
each not exceeding five years, based on a re-
assessment in accordance with the applicable
conformity assessment procedures.
3.
Where a notified body finds that an AI
system no longer meets the requirements set out
in Chapter 2 of this Title, it shall, taking account
of the principle of proportionality, suspend or
withdraw the certificate issued or impose any
restrictions on it, unless compliance with those
requirements is ensured by appropriate
corrective action taken by the provider of the
system within an appropriate deadline set by the
notified body. The notified body shall give
reasons for its decision.
Article 45
Appeal against decisions of notified bodies
Member States shall ensure that an appeal
procedure against decisions of the notified
bodies is available to parties having a legitimate
interest in that decision.
Article 46
Information obligations of notified bodies
1.
Notified bodies shall inform the notifying
authority of the following:
(a) any Union technical documentation
assessment certificates, any supplements to
those certificates, quality management system
approvals issued in accordance with the
requirements of Annex VII;
(b) any refusal, restriction, suspension or
withdrawal of a Union technical documentation
assessment certificate or a quality management
system approval issued in accordance with the
requirements of Annex VII;
(c) any circumstances affecting the scope of
or conditions for notification;
(d) any request for information which they
have received from market surveillance
authorities regarding conformity assessment
activities;
(e) on request, conformity assessment
activities performed within the scope of their
notification and any other activity performed,
including cross-border activities and
subcontracting.
2.
Each notified body shall inform the other
notified bodies of:
(a) quality management system approvals
which it has refused, suspended or withdrawn,
and, upon request, of quality system approvals
which it has issued;
(b) EU technical documentation assessment
certificates or any supplements thereto which it
has refused, withdrawn, suspended or otherwise
restricted, and, upon request, of the certificates
and/or supplements thereto which it has issued.
3.
Each notified body shall provide the other
notified bodies carrying out similar conformity
assessment activities covering the same artificial
intelligence technologies with relevant
information on issues relating to negative and,
on request, positive conformity assessment
results.
Article 47
Derogation from conformity assessment
procedure
1.
By way of derogation from Article 43, any
market surveillance authority may authorise the
placing on the market or putting into service of
specific high-risk AI systems within the territory
of the Member State concerned, for exceptional
reasons of public security or the protection of
life and health of persons, environmental
protection and the protection of key industrial
and infrastructural assets. That authorisation
shall be for a limited period of time, while the
necessary conformity assessment procedures are
being carried out, and shall terminate once those
procedures have been completed. The
completion of those procedures shall be
undertaken without undue delay.
2.
The authorisation referred to in paragraph
1 shall be issued only if the market surveillance
authority concludes that the high-risk AI system
complies with the requirements of Chapter 2 of
this Title. The market surveillance authority
shall inform the Commission and the other
Member States of any authorisation issued
pursuant to paragraph 1.
3.
Where, within 15 calendar days of receipt
of the information referred to in paragraph 2, no
objection has been raised by either a Member
State or the Commission in respect of an
authorisation issued by a market surveillance
authority of a Member State in accordance with
paragraph 1, that authorisation shall be deemed
justified.
4.
Where, within 15 calendar days of receipt
of the notification referred to in paragraph 2,
objections are raised by a Member State against
an authorisation issued by a market surveillance
authority of another Member State, or where the
Commission considers the authorisation to be
contrary to Union law or the conclusion of the
Member States regarding the compliance of the
system as referred to in paragraph 2 to be
unfounded, the Commission shall without delay
enter into consultation with the relevant
Member State; the operator(s) concerned shall
be consulted and have the possibility to present
their views. In view thereof, the Commission
shall decide whether the authorisation is
justified or not. The Commission shall address
its decision to the Member State concerned and
the relevant operator or operators.
5.
If the authorisation is considered
unjustified, this shall be withdrawn by the
market surveillance authority of the Member
State concerned.
6.
By way of derogation from paragraphs 1
to 5, for high-risk AI systems intended to be
used as safety components of devices, or which
are themselves devices, covered by Regulation
(EU) 2017/745 and Regulation (EU) 2017/746,
Article 59 of Regulation (EU) 2017/745 and
Article 54 of Regulation (EU) 2017/746 shall
apply also with regard to the derogation from
the conformity assessment of the compliance
with the requirements set out in Chapter 2 of
this Title.
Article 48
EU declaration of conformity
1.
The provider shall draw up a written EU
declaration of conformity for each AI system
and keep it at the disposal of the national
competent authorities for 10 years after the AI
system has been placed on the market or put
into service. The EU declaration of conformity
shall identify the AI system for which it has
been drawn up. A copy of the EU declaration of
conformity shall be given to the relevant
national competent authorities upon request.
2.
The EU declaration of conformity shall
state that the high-risk AI system in question
meets the requirements set out in Chapter 2 of
this Title. The EU declaration of conformity
shall contain the information set out in Annex V
and shall be translated into an official Union
language or languages required by the Member
State(s) in which the high-risk AI system is
made available.
3.
Where high-risk AI systems are subject to
other Union harmonisation legislation which
also requires an EU declaration of conformity, a
single EU declaration of conformity shall be
drawn up in respect of all Union legislations
applicable to the high-risk AI system. The
declaration shall contain all the information
required for identification of the Union
harmonisation legislation to which the
declaration relates.
4.
By drawing up the EU declaration of
conformity, the provider shall assume
responsibility for compliance with the
requirements set out in Chapter 2 of this Title.
The provider shall keep the EU declaration of
conformity up-to-date as appropriate.
5.
The Commission shall be empowered to
adopt delegated acts in accordance with Article
73 for the purpose of updating the content of the
EU declaration of conformity set out in Annex
V in order to introduce elements that become
necessary in light of technical progress.
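As a rough illustration of the record-keeping Article 48 describes, the sketch below models a declaration of conformity as a small data structure with a completeness check. The field names are assumptions for illustration only; the authoritative list of contents is Annex V, which this sketch does not reproduce.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: field names are assumptions loosely based on
# Article 48; Annex V is the authoritative list of required contents.
@dataclass
class EUDeclarationOfConformity:
    ai_system_name: str
    provider_name: str
    provider_address: str
    harmonised_standards: list = field(default_factory=list)
    notified_body_id: str = ""   # only where a notified body was involved
    place_and_date: str = ""
    language: str = "en"         # must be a language required by the Member State(s)

    def missing_fields(self) -> list:
        """Return names of mandatory fields left empty (cf. Article 48(1)-(2))."""
        required = ["ai_system_name", "provider_name",
                    "provider_address", "place_and_date"]
        return [name for name in required if not getattr(self, name)]

decl = EUDeclarationOfConformity(
    ai_system_name="ExampleRiskScorer v1.2",
    provider_name="Example Provider Ltd",
    provider_address="1 Example Street, Dublin",
    place_and_date="Dublin, 2022-01-06",
)
print(decl.missing_fields())  # → []
```

A provider keeping the declaration "up-to-date as appropriate" (paragraph 4) would re-run such a check whenever the system or the applicable standards change.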
Article 49
CE marking of conformity
1.
The CE marking shall be affixed visibly, legibly and indelibly for high-risk AI systems. Where that is not possible or not warranted on account of the nature of the high-risk AI system, it shall be affixed to the packaging or to the accompanying documentation, as appropriate.

Comments: Our understanding is that the instances where the CE marking cannot be affixed to the packaging, and can instead be placed on the accompanying documentation, would include systems that are deployed in an online domain. IE seeks clarification that this interpretation is correct.
2.
The CE marking referred to in paragraph 1
of this Article shall be subject to the general
principles set out in Article 30 of Regulation
(EC) No 765/2008.
3.
Where applicable, the CE marking shall
be followed by the identification number of the
notified body responsible for the conformity
assessment procedures set out in Article 43. The
identification number shall also be indicated in
any promotional material which mentions that
the high-risk AI system fulfils the requirements
for CE marking.
Article 50
Document retention
The provider shall, for a period ending 10 years
after the AI system has been placed on the
market or put into service, keep at the disposal
of the national competent authorities:
(a) the technical documentation referred to in
Article 11;
(b) the documentation concerning the quality
management system referred to Article 17;
(c) the documentation concerning the changes
approved by notified bodies where applicable;
(d) the decisions and other documents issued
by the notified bodies where applicable;
(e) the EU declaration of conformity referred
to in Article 48.
Article 51
Registration
Before placing on the market or putting into
service a high-risk AI system referred to in
Article 6(2), the provider or, where applicable,
the authorised representative shall register that
system in the EU database referred to in Article
60.
TITLE IV
TRANSPARENCY OBLIGATIONS
FOR CERTAIN AI SYSTEMS
Article 52
Transparency obligations for certain AI systems
1.
Providers shall ensure that AI systems
intended to interact with natural persons are
designed and developed in such a way that
natural persons are informed that they are
interacting with an AI system, unless this is
obvious from the circumstances and the context
of use. This obligation shall not apply to AI
systems authorised by law to detect, prevent,
investigate and prosecute criminal offences,
unless those systems are available for the public
to report a criminal offence.
2.
Users of an emotion recognition system or
a biometric categorisation system shall inform
of the operation of the system the natural
persons exposed thereto. This obligation shall
not apply to AI systems used for biometric
categorisation, which are permitted by law to
detect, prevent and investigate criminal
offences.
3.
Users of an AI system that generates or
manipulates image, audio or video content that
appreciably resembles existing persons, objects,
places or other entities or events and would
falsely appear to a person to be authentic or
truthful (‘deep fake’), shall disclose that the
content has been artificially generated or
manipulated.
However, the first subparagraph shall not apply
where the use is authorised by law to detect,
prevent, investigate and prosecute criminal
offences or it is necessary for the exercise of the
right to freedom of expression and the right to
freedom of the arts and sciences guaranteed in
the Charter of Fundamental Rights of the EU,
and subject to appropriate safeguards for the
rights and freedoms of third parties.
4.
Paragraphs 1, 2 and 3 shall not affect the
requirements and obligations set out in Title III
of this Regulation.
TITLE IVA
GENERAL PURPOSE AI SYSTEMS
Article 52a
General purpose AI systems
1.
The placing on the market, putting into
service or use of general purpose AI systems
shall not, by themselves only, make those
systems subject to the provisions of this
Regulation.
2.
Any person who places on the market
or puts into service under its own name or
trademark or uses a general purpose AI
system made available on the market or put
into service for an intended purpose that
makes it subject to the provisions of this
Regulation shall be considered the provider
of the AI system subject to the provisions of
this Regulation.
3.
Paragraph 2 shall apply, mutatis
mutandis, to any person who integrates a
general purpose AI system made available on
the market, with or without modifying it, into
an AI system whose intended purpose makes
it subject to the provisions of this Regulation.
4.
The provisions of this Article shall
apply irrespective of whether the general
purpose AI system is open source software or
not.
TITLE V
MEASURES IN SUPPORT OF
INNOVATION
Article 53
AI regulatory sandboxes
1.
AI regulatory sandboxes established by
one or more Member States competent
authorities or the European Data Protection
Supervisor shall provide a controlled
environment that facilitates the development,
testing and validation of innovative AI systems
for a limited time before their placement on the
market or putting into service
pursuant to a
specific plan. This shall take place under the
direct supervision and guidance by the
competent authorities with a view to ensuring
compliance with the requirements of this
Regulation and, where relevant, other Union and
Member States legislation supervised within the
sandbox.
2.
Member States shall ensure that to the
extent the innovative AI systems involve the
processing of personal data or otherwise fall
under the supervisory remit of other national
authorities or competent authorities providing or
supporting access to data, the national data
protection authorities and those other national
authorities are associated to the operation of the
AI regulatory sandbox.
3.
The AI regulatory sandboxes shall not
affect the supervisory and corrective powers of
the competent authorities. Any significant risks
to health and safety and fundamental rights
identified during the development and testing of
such systems shall result in immediate
mitigation and, failing that, in the suspension of
the development and testing process until such
mitigation takes place.
4.
Participants in the AI regulatory sandbox
shall remain liable under applicable Union and
Member States liability legislation for any harm
inflicted on third parties as a result of the
experimentation taking place in the sandbox.
5.
Member States’ competent authorities that
have established AI regulatory sandboxes shall
coordinate their activities and cooperate within
the framework of the European Artificial
Intelligence Board. They shall submit annual
reports to the Board and the Commission on the
results from the implementation of those
schemes, including good practices, lessons learnt
and recommendations on their setup and, where
relevant, on the application of this Regulation
and other Union legislation supervised within
the sandbox.
6.
The modalities and the conditions of the
operation of the AI regulatory sandboxes,
including the eligibility criteria and the
procedure for the application, selection,
participation and exiting from the sandbox, and
the rights and obligations of the participants
shall be set out in implementing acts. Those
implementing acts shall be adopted in
accordance with the examination procedure
referred to in Article 74(2).
Article 54
Further processing of personal data for
developing certain AI systems in the public
interest in the AI regulatory sandbox
1.
In the AI regulatory sandbox personal data
lawfully collected for other purposes shall be
processed for the purposes of developing and
testing certain innovative AI systems in the
sandbox under the following conditions:
(a) the innovative AI systems shall be
developed for safeguarding substantial public
interest in one or more of the following areas:
(i)
the prevention, investigation, detection or
prosecution of criminal offences or the
execution of criminal penalties, including the
safeguarding against and the prevention of
threats to public security, under the control and
responsibility of the competent authorities. The
processing shall be based on Member State or
Union law;
(ii) public safety and public health, including
disease prevention, control and treatment;
(iii) a high level of protection and
improvement of the quality of the environment;
(b) the data processed are necessary for
complying with one or more of the requirements
referred to in Title III, Chapter 2 where those
requirements cannot be effectively fulfilled by
processing anonymised, synthetic or other non-
personal data;
(c) there are effective monitoring mechanisms
to identify if any high risks to the fundamental
rights of the data subjects may arise during the
sandbox experimentation as well as response
mechanisms to promptly mitigate those risks and,
where necessary, stop the processing;
(d) any personal data to be processed in the
context of the sandbox are in a functionally
separate, isolated and protected data processing
environment under the control of the
participants and only authorised persons have
access to that data;
(e) any personal data processed are not to be
transmitted, transferred or otherwise accessed
by other parties;
(f)
any processing of personal data in the
context of the sandbox do not lead to measures
or decisions affecting the data subjects;
(g) any personal data processed in the context
of the sandbox are deleted once the participation
in the sandbox has terminated or the personal
data has reached the end of its retention period;
(h) the logs of the processing of personal data
in the context of the sandbox are kept for the
duration of the participation in the sandbox and
1 year after its termination, solely for the
purpose of and only as long as necessary for
fulfilling accountability and documentation
obligations under this Article or other
applicable Union or Member States legislation;
(i)
complete and detailed description of the
process and rationale behind the training, testing
and validation of the AI system is kept together
with the testing results as part of the technical
documentation in Annex IV;
(j)
a short summary of the AI project
developed in the sandbox, its objectives and
expected results is published on the website of the
competent authorities.
2.
Paragraph 1 is without prejudice to Union
or Member States legislation excluding
processing for other purposes than those
explicitly mentioned in that legislation.
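The log-retention condition in paragraph 1(h) above (logs kept for the duration of participation in the sandbox and for 1 year after its termination) can be sketched as a small date calculation. This is illustrative only; reading "1 year" as 365 days is a simplifying assumption.

```python
from datetime import date, timedelta

# Sketch of the retention rule in Article 54(1)(h): processing logs are kept
# for the duration of sandbox participation and for 1 year after termination.
# The 365-day reading of "1 year" is an assumption for simplicity.
def logs_must_be_retained(participation_end: date, today: date) -> bool:
    return today <= participation_end + timedelta(days=365)

print(logs_must_be_retained(date(2023, 6, 30), date(2024, 6, 1)))   # → True
print(logs_must_be_retained(date(2023, 6, 30), date(2024, 7, 15)))  # → False
```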
Article 55
Measures for SME small-scale providers and users

Comments: It would be important to ensure that the engagement of SMEs (including start-ups) is not hindered by undue administrative burden on these enterprises, who are vital to growth and innovation in the AI ecosystem.
1.
Member States shall undertake the
following actions:
(a) provide small-scale SME providers, including and start-ups with priority access to the AI regulatory sandboxes to the extent that they fulfil the eligibility conditions;
(b) organise specific awareness raising
activities about the application of this
Regulation tailored to the needs of the small-
scale
SME providers and users;
(c) where appropriate, establish a dedicated
channel for communication with small-scale
SME providers and users and other innovators to
provide guidance and respond to queries about
the implementation of this Regulation.
2.
The specific interests and needs of the
small-scale
SME providers shall be taken into
account when setting the fees for conformity
assessment under Article 43, reducing those fees
proportionately to their size and market size.
TITLE VI
GOVERNANCE
CHAPTER 1
EUROPEAN ARTIFICIAL INTELLIGENCE
BOARD
Article 56
Establishment of the European Artificial
Intelligence Board
1.
A ‘European Artificial Intelligence Board’
(the ‘Board’) is established.
2.
The Board shall provide advice and
assistance to the Commission in order to:
(a) contribute to the effective cooperation of
the national supervisory authorities and the
Commission with regard to matters covered by
this Regulation;
(b) coordinate and contribute to guidance and
analysis by the Commission and the national
supervisory authorities and other competent
authorities on emerging issues across the
internal market with regard to matters covered
by this Regulation;
(c) assist the national supervisory authorities
and the Commission in ensuring the consistent
application of this Regulation.
Article 57
Structure of the Board
1.
The Board shall be composed of the
national supervisory authorities, who shall be
represented by the head or equivalent high-level
official of that authority, and the European Data
Protection Supervisor. Other national authorities
may be invited to the meetings, where the issues
discussed are of relevance for them.
2.
The Board shall adopt its rules of
procedure by a simple majority of its members,
following the consent of the Commission. The
rules of procedure shall also contain the
operational aspects related to the execution of
the Board’s tasks as listed in Article 58. The
Board may establish sub-groups as appropriate
for the purpose of examining specific questions.
3.
The Board shall be chaired by the
Commission. The Commission shall convene
the meetings and prepare the agenda in
accordance with the tasks of the Board pursuant
to this Regulation and with its rules of
procedure. The Commission shall provide
administrative and analytical support for the
activities of the Board pursuant to this
Regulation.
4.
The Board may invite external experts and
observers to attend its meetings and may hold
exchanges with interested third parties to inform
its activities to an appropriate extent. To that
end the Commission may facilitate exchanges
between the Board and other Union bodies,
offices, agencies and advisory groups.
Article 58
Tasks of the Board
When providing advice and assistance to the
Commission in the context of Article 56(2), the
Board shall in particular:
(a) collect and share expertise and best
practices among Member States;
(b) contribute to uniform administrative
practices in the Member States, including for the
functioning of regulatory sandboxes referred to
in Article 53;
(c) issue opinions, recommendations or
written contributions on matters related to the
implementation of this Regulation, in particular
(i)
on technical specifications or existing
standards regarding the requirements set out in
Title III, Chapter 2,
(ii) on the use of harmonised standards or
common specifications referred to in Articles 40
and 41,
(iii) on the preparation of guidance documents,
including the guidelines concerning the setting
of administrative fines referred to in Article 71.
;
(d) issue an advisory opinion on the need
for amendment of Annex I and Annex III,
including in light of available evidence.
CHAPTER 2
NATIONAL COMPETENT AUTHORITIES
Article 59
Designation of national competent authorities

Comments: Adequate support and time will be needed for Market Surveillance Authorities, particularly where they have no previous experience in testing high-risk AI systems and will need to be upskilled to implement the requirements of the draft AI Regulation.
1.
National competent authorities shall be
established or designated by each Member State
for the purpose of ensuring the application and
implementation of this Regulation. National
competent authorities shall be organised so as to
safeguard the objectivity and impartiality of
their activities and tasks.
2.
Each Member State shall designate a
national supervisory authority among the
national competent authorities. The national
supervisory authority shall act as notifying
authority and market surveillance authority
unless a Member State has organisational and
administrative reasons to designate more than
one authority.
3.
Member States shall inform the
Commission of their designation or designations
and, where applicable, the reasons for
designating more than one authority.
4.
Member States shall ensure that national
competent authorities are provided with
adequate financial and human resources to fulfil
their tasks under this Regulation. In particular,
national competent authorities shall have a
sufficient number of personnel permanently
available whose competences and expertise
shall include an in-depth understanding of
artificial intelligence technologies, data and data
computing, fundamental rights, health and
safety risks and knowledge of existing standards
and legal requirements.
5.
Member States shall report to the
Commission on an annual basis on the status of
the financial and human resources of the
national competent authorities with an
assessment of their adequacy. The Commission
shall transmit that information to the Board for
discussion and possible recommendations.
6.
The Commission shall facilitate the
exchange of experience between national
competent authorities.
7.
National competent authorities may
provide guidance and advice on the
implementation of this Regulation, including
tailored to small-scale
SME providers.
Whenever national competent authorities intend
to provide guidance and advice with regard to
an AI system in areas covered by other Union
legislation, the competent national authorities
under that Union legislation shall be consulted,
as appropriate. Member States may also
establish one central contact point for
communication with operators.
8.
When Union institutions, agencies and
bodies fall within the scope of this Regulation,
the European Data Protection Supervisor shall
act as the competent authority for their
supervision.
TITLE VII
EU DATABASE FOR STAND-
ALONE HIGH-RISK AI SYSTEMS
Article 60
EU database for stand-alone high-risk AI
systems
1.
The Commission shall, in collaboration with the Member States, set up and maintain an EU database containing information referred to in paragraph 2 concerning high-risk AI systems referred to in Article 6(2) which are registered in accordance with Article 51.

Comments: With respect to the transparency obligations for AI systems intended to interact with natural persons as outlined under Article 52, the exemption for AI systems authorised by law to detect, prevent, investigate and prosecute criminal offences takes account of security and public safety concerns. We would welcome clarification that the inclusion of details of such systems in an EU database for stand-alone high-risk AI systems takes into account similar concerns.
2.
The data listed in Annex VIII shall be
entered into the EU database by the providers.
The Commission shall provide them with
technical and administrative support.
3.
Information contained in the EU database
shall be accessible to the public.
4.
The EU database shall contain personal
data only insofar as necessary for collecting and
processing information in accordance with this
Regulation. That information shall include the
names and contact details of natural persons
who are responsible for registering the system
and have the legal authority to represent the
provider.
5.
The Commission shall be the controller of
the EU database. It shall also ensure to
providers adequate technical and administrative
support.
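Article 60(4) limits the personal data the database may hold to the names and contact details of the natural persons responsible for registration. The small sketch below illustrates that constraint as a validation step; the field names are assumptions for illustration, and the authoritative list of data to be entered is Annex VIII.

```python
# Sketch of the Article 60(4) constraint: the EU database may hold personal
# data only insofar as necessary, i.e. names and contact details of the
# persons responsible for registering the system. Field names are assumed.
ALLOWED_PERSONAL = {"name", "contact_email"}

def excess_personal_data(registrant: dict) -> list:
    """Return any personal-data keys that go beyond what Article 60(4) permits."""
    return sorted(set(registrant) - ALLOWED_PERSONAL)

print(excess_personal_data({"name": "A. Registrar",
                            "contact_email": "a@example.com"}))   # → []
print(excess_personal_data({"name": "A. Registrar",
                            "date_of_birth": "1980-01-01"}))      # → ['date_of_birth']
```

Since the database is publicly accessible under paragraph 3, rejecting surplus personal data at registration time is the natural place to enforce the limit.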
TITLE VIII
POST-MARKET MONITORING,
INFORMATION SHARING,
MARKET SURVEILLANCE
CHAPTER 1
POST-MARKET MONITORING
Article 61
Post-market monitoring by providers and post-
market monitoring plan for high-risk AI systems
1.
Providers shall establish and document a
post-market monitoring system in a manner that
is proportionate to the nature of the artificial
intelligence technologies and the risks of the
high-risk AI system.
2.
The post-market monitoring system shall
actively and systematically collect, document
and analyse relevant data provided by users or
collected through other sources on the
performance of high-risk AI systems throughout
their lifetime, and allow the provider to evaluate
the continuous compliance of AI systems with
the requirements set out in Title III, Chapter 2.
3.
The post-market monitoring system shall
be based on a post-market monitoring plan. The
post-market monitoring plan shall be part of the
technical documentation referred to in Annex
IV. The Commission shall adopt an
implementing act laying down detailed
provisions establishing a template for the post-
market monitoring plan and the list of elements
to be included in the plan.
4.
For high-risk AI systems covered by the
legal acts referred to in Annex II, where a post-
market monitoring system and plan is already
established under that legislation, the elements
described in paragraphs 1, 2 and 3 shall be
integrated into that system and plan as
appropriate.
The first subparagraph shall also apply to high-
risk AI systems referred to in point 5(b) of
Annex III placed on the market or put into
service by credit institutions regulated by
Directive 2013/36/EU.
CHAPTER 2
SHARING OF INFORMATION ON SERIOUS
INCIDENTS AND MALFUNCTIONING
Article 62
Reporting of serious incidents and of
malfunctioning
1.
Providers of high-risk AI systems placed
on the Union market shall report any serious
incident or any malfunctioning of those systems
which constitutes a breach of obligations under
Union law intended to protect fundamental
rights to the market surveillance authorities of
the Member States where that incident or breach
occurred.
Such notification shall be made immediately
after the provider has established a causal link
between the AI system and the
serious incident
or malfunctioning or the reasonable likelihood
of such a link, and, in any event, not later than
15 days after the providers becomes aware of
the serious incident or of the malfunctioning.
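The outer limit in paragraph 1 (notification no later than 15 days after the provider becomes aware of the serious incident or malfunctioning) reduces to a simple date calculation. This sketch assumes calendar days, which the text does not specify.

```python
from datetime import date, timedelta

# Sketch of the Article 62(1) outer deadline: notification not later than
# 15 days after the provider becomes aware of the serious incident or
# malfunctioning. Calendar days are assumed; the text does not specify.
def latest_notification_date(awareness_date: date) -> date:
    return awareness_date + timedelta(days=15)

print(latest_notification_date(date(2022, 1, 6)))  # → 2022-01-21
```

Note that the primary obligation is to notify immediately once a causal link (or its reasonable likelihood) is established; the 15-day limit is only a backstop.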
2.
Upon receiving a notification related to a
serious incident referred to in Article 3(44)(c)
a breach of obligations under Union law
intended to protect fundamental rights, the
relevant market surveillance authority shall
inform the national public authorities or bodies
referred to in Article 64(3). The Commission
shall develop dedicated guidance to facilitate
compliance with the obligations set out in
paragraph 1. That guidance shall be issued 12
months after the entry into force of this
Regulation, at the latest.
3.
For high-risk AI systems referred to in
point 5(b) of Annex III which are placed on the
market or put into service by providers that are
credit institutions regulated by Directive
2013/36/EU and for high-risk AI systems which
are safety components of devices, or are
themselves devices, covered by Regulation (EU)
2017/745 and Regulation (EU) 2017/746, the
notification of serious incidents or
malfunctioning shall be limited to those
referred to in Article 3(44)(c) that
constitute a breach of obligations under Union
law intended to protect fundamental rights.
CHAPTER 3
ENFORCEMENT
Article 63
Market surveillance and control of AI systems in
the Union market
1.
Regulation (EU) 2019/1020 shall apply to
AI systems covered by this Regulation.
However, for the purpose of the effective
enforcement of this Regulation:
(a) any reference to an economic operator
under Regulation (EU) 2019/1020 shall be
understood as including all operators identified
in Title III, Chapter 3
Article 2 of this
Regulation;
(b) any reference to a product under
Regulation (EU) 2019/1020 shall be understood
as including all AI systems falling within the
scope of this Regulation.
2.
The national supervisory authority shall
report to the Commission on a regular basis the
outcomes of relevant market surveillance
activities. The national supervisory authority
shall report, without delay, to the Commission
and relevant national competition authorities
any information identified in the course of
market surveillance activities that may be of
potential interest for the application of Union
law on competition rules.
3.
For high-risk AI systems, related to
products to which legal acts listed in Annex II,
section A apply, the market surveillance
authority for the purposes of this Regulation
shall be the authority responsible for market
surveillance activities designated under those
legal acts.
4.
For AI systems placed on the market, put
into service or used by financial institutions
regulated by Union legislation on financial
services, the market surveillance authority for
the purposes of this Regulation shall be the
relevant authority responsible for the financial
supervision of those institutions under that
legislation.
5.
For AI systems listed in point 1(a) in so
far as the systems are used for law enforcement
purposes, points 6 and 7 of Annex III, Member
States shall designate as market surveillance
authorities for the purposes of this Regulation
either the competent data protection supervisory
authorities under Directive (EU) 2016/680, or
Regulation 2016/679 or the national competent
authorities supervising the activities of the law
enforcement, immigration or asylum authorities
putting into service or using those systems.
6.
Where Union institutions, agencies and
bodies fall within the scope of this Regulation,
the European Data Protection Supervisor shall
act as their market surveillance authority.
7.
Member States shall facilitate the
coordination between market surveillance
authorities designated under this Regulation and
other relevant national authorities or bodies
which supervise the application of Union
harmonisation legislation listed in Annex II or
other Union legislation that might be relevant
for the high-risk AI systems referred to in
Annex III.
Article 64
Access to data and documentation
1.
Access to data and documentation in the
context of their activities, the market
surveillance authorities shall be granted full
access to the training, validation and testing
datasets used by the provider, including through
application programming interfaces (‘API’) or
other appropriate technical means and tools
enabling remote access.
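Paragraph 1 envisages programmatic, remote access to the provider's training, validation and testing datasets, including through APIs. The sketch below illustrates one read-only shape such access could take; the class and method names are illustrative assumptions, not anything prescribed by the Regulation.

```python
# Minimal sketch of the kind of read-only programmatic access Article 64(1)
# envisages for market surveillance authorities. All names here are
# illustrative assumptions; the Regulation prescribes no specific interface.
class DatasetAccessAPI:
    def __init__(self, datasets: dict):
        # e.g. {"training": [...], "validation": [...], "testing": [...]}
        self._datasets = datasets

    def list_splits(self) -> list:
        return sorted(self._datasets)

    def read(self, split: str) -> tuple:
        # Return an immutable view: surveillance access is read-only.
        return tuple(self._datasets[split])

api = DatasetAccessAPI({"training": [1, 2], "validation": [3], "testing": [4]})
print(api.list_splits())       # → ['testing', 'training', 'validation']
print(api.read("validation"))  # → (3,)
```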
2.
Where necessary to assess the conformity of the high-risk AI system with the requirements set out in Title III, Chapter 2 and upon a reasoned request, the market surveillance authorities shall be granted access to the source code of the AI system.

Comments: This may be difficult to implement, with complications arising from data-sharing rules and intellectual property risk to the producer of the product/service. IE suggests that mitigating measures, such as safeguards as to non-disclosure of data and respect for confidentiality/intellectual property, should be included in the Regulation so as not to discourage AI producers from operating in European territory.
3.
National public authorities or bodies
which supervise or enforce the respect of
obligations under Union law protecting
fundamental rights in relation to the use of high-
risk AI systems referred to in Annex III shall
have the power to request and access any
documentation created or maintained under this
Regulation when access to that documentation is
necessary for the fulfilment of the competences
under their mandate within the limits of their
jurisdiction. The relevant public authority or
body shall inform the market surveillance
authority of the Member State concerned of any
such request.
4.
By 3 months after the entering into force
of this Regulation, each Member State shall
identify the public authorities or bodies referred
to in paragraph 3 and make a list publicly
available on the website of the national
supervisory authority. Member States shall
notify the list to the Commission and all other
Member States and keep the list up to date.
5.
Where the documentation referred to in
paragraph 3 is insufficient to ascertain whether a
breach of obligations under Union law intended
to protect fundamental rights has occurred, the
public authority or body referred to paragraph 3
may make a reasoned request to the market
surveillance authority to organise testing of the
high-risk AI system through technical means.
The market surveillance authority shall organise
the testing with the close involvement of the
requesting public authority or body within
reasonable time following the request.
6.
Any information and documentation
obtained by the national public authorities or
bodies referred to in paragraph 3 pursuant to the
provisions of this Article shall be treated in
compliance with the confidentiality obligations
set out in Article 70.
Article 65
Procedure for dealing with AI systems
presenting a risk at national level
1.
AI systems presenting a risk shall be
understood as a product presenting a risk
defined in Article 3, point 19 of Regulation
(EU) 2019/1020 insofar as risks to the health or
safety or to the protection of fundamental rights
of persons are concerned.
2.
Where the market surveillance authority
of a Member State has sufficient reasons to
consider that an AI system presents a risk as
referred to in paragraph 1, they shall carry out
an evaluation of the AI system concerned in
respect of its compliance with all the
requirements and obligations laid down in this
Regulation. When risks to the protection of
fundamental rights are present, the market
surveillance authority shall also inform the
relevant national public authorities or bodies
referred to in Article 64(3). The relevant
operators shall cooperate as necessary with the
market surveillance authorities and the other
national public authorities or bodies referred to
in Article 64(3).
Where, in the course of that evaluation, the
market surveillance authority finds that the AI
system does not comply with the requirements
and obligations laid down in this Regulation, it
shall without delay require the relevant operator
to take all appropriate corrective actions to bring
the AI system into compliance, to withdraw the
AI system from the market, or to recall it within
a reasonable period, commensurate with the
nature of the risk, as it may prescribe.
The market surveillance authority shall inform
the relevant notified body accordingly. Article
18 of Regulation (EU) 2019/1020 shall apply to
the measures referred to in the second
subparagraph.
3.
Where the market surveillance authority
considers that non-compliance is not restricted
to its national territory, it shall inform the
Commission and the other Member States of the
results of the evaluation and of the actions
which it has required the operator to take.
4.
The operator shall ensure that all
appropriate corrective action is taken in respect
of all the AI systems concerned that it has made
available on the market throughout the Union.
5.
Where the operator of an AI system does
not take adequate corrective action within the
period referred to in paragraph 2, the market
surveillance authority shall take all appropriate
provisional measures to prohibit or restrict the
AI system's being made available on its national
market, to withdraw the product from that
market or to recall it. That authority shall inform
the Commission and the other Member States,
without delay, of those measures.
6.
The information referred to in paragraph 5
shall include all available details, in particular
the data necessary for the identification of the
non-compliant AI system, the origin of the AI
system, the nature of the non-compliance
alleged and the risk involved, the nature and
duration of the national measures taken and the
arguments put forward by the relevant operator.
In particular, the market surveillance authorities
shall indicate whether the non-compliance is
due to one or more of the following:
(a) a failure of the AI system to meet
requirements set out in Title III, Chapter 2;
(b) shortcomings in the harmonised standards
or common specifications referred to in Articles
40 and 41 conferring a presumption of
conformity.
7.
The market surveillance authorities of the
Member States other than the market
surveillance authority of the Member State
initiating the procedure shall without delay
inform the Commission and the other Member
States of any measures adopted and of any
additional information at their disposal relating
to the non-compliance of the AI system
concerned, and, in the event of disagreement
with the notified national measure, of their
objections.
8.
Where, within three months of receipt of
the information referred to in paragraph 5, no
objection has been raised by either a Member
State or the Commission in respect of a
provisional measure taken by a Member State,
that measure shall be deemed justified. This is
without prejudice to the procedural rights of the
concerned operator in accordance with Article
18 of Regulation (EU) 2019/1020.
9.
The market surveillance authorities of all
Member States shall ensure that appropriate
restrictive measures are taken in respect of the
product concerned, such as withdrawal of the
product from their market, without delay.
Article 66
Union safeguard procedure
1.
Where, within three months of receipt of
the notification referred to in Article 65(5),
objections are raised by a Member State against
a measure taken by another Member State, or
where the Commission considers the measure to
be contrary to Union law, the Commission shall
without delay enter into consultation with the
relevant Member State and operator or operators
and shall evaluate the national measure. On the
basis of the results of that evaluation, the
Commission shall decide whether the national
measure is justified or not within 9 months from
the notification referred to in Article 65(5) and
notify such decision to the Member State
concerned.
2.
If the national measure is considered
justified, all Member States shall take the
measures necessary to ensure that the non-
compliant AI system is withdrawn from their
market, and shall inform the Commission
accordingly. If the national measure is
considered unjustified, the Member State
concerned shall withdraw the measure.
3.
Where the national measure is considered
justified and the non-compliance of the AI
system is attributed to shortcomings in the
harmonised standards or common specifications
referred to in Articles 40 and 41 of this
Regulation, the Commission shall apply the
procedure provided for in Article 11 of
Regulation (EU) No 1025/2012.
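The interplay of deadlines in Articles 65(8) and 66(1) — a three-month objection window and a nine-month Commission decision period, both counted from the Article 65(5) notification — can be sketched as a small status function. This is an illustrative reading only; the function and status labels are this sketch's own, not terms from the Regulation.

```python
def measure_status(months_since_notification: int,
                   objection_raised: bool,
                   commission_decided: bool) -> str:
    """Rough status of a notified provisional national measure.

    Sketch of draft Articles 65(8) and 66(1): absent any objection within
    three months, the measure is deemed justified; an objection triggers a
    Commission evaluation that must conclude within nine months of the
    same notification.
    """
    if not objection_raised:
        # Art. 65(8): silence for three months means the measure stands.
        return ("deemed justified" if months_since_notification >= 3
                else "objection window open")
    if commission_decided:
        return "Commission decision notified"
    # Art. 66(1): the Commission shall decide within 9 months.
    return ("decision overdue" if months_since_notification > 9
            else "under Commission evaluation")

print(measure_status(4, objection_raised=False, commission_decided=False))
# → deemed justified
```

The same "deemed justified" default appears in other New Legislative Framework safeguard clauses; the three-month window here mirrors Article 66(1)'s trigger.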
Article 67
Compliant AI systems which present a risk
1.
Where, having performed an evaluation
under Article 65, the market surveillance
authority of a Member State finds that although
an AI system is in compliance with this
Regulation, it presents a risk to the health or
safety of persons, to the compliance with
obligations under Union or national law
intended to protect fundamental rights or to
other aspects of public interest protection, it
shall require the relevant operator to take all
appropriate measures to ensure that the AI
system concerned, when placed on the market or
put into service, no longer presents that risk, to
withdraw the AI system from the market or to
recall it within a reasonable period,
commensurate with the nature of the risk, as it
may prescribe.
2.
The provider or other relevant operators
shall ensure that corrective action is taken in
respect of all the AI systems concerned that they
have made available on the market throughout
the Union within the timeline prescribed by the
market surveillance authority of the Member
State referred to in paragraph 1.
3.
The Member State shall immediately
inform the Commission and the other Member
States. That information shall include all
available details, in particular the data necessary
for the identification of the AI system
concerned, the origin and the supply chain of
the AI system, the nature of the risk involved
and the nature and duration of the national
measures taken.
4.
The Commission shall without delay enter
into consultation with the Member States and
the relevant operator and shall evaluate the
national measures taken. On the basis of the
results of that evaluation, the Commission shall
decide whether the measure is justified or not
and, where necessary, propose appropriate
measures.
5.
The Commission shall address its decision
to the Member States.
Article 68
Formal non-compliance
1.
Where the market surveillance authority
of a Member State makes one of the following
findings, it shall require the relevant provider to
put an end to the non-compliance concerned:
(a) the conformity marking has been affixed
in violation of Article 49;
(b) the conformity marking has not been
affixed;
(c) the EU declaration of conformity has not
been drawn up;
(d) the EU declaration of conformity has not
been drawn up correctly;
(e) the identification number of the notified
body, which is involved in the conformity
assessment procedure, where applicable, has not
been affixed;
2.
Where the non-compliance referred to in
paragraph 1 persists, the Member State
concerned shall take all appropriate measures to
restrict or prohibit the high-risk AI system being
made available on the market or ensure that it is
recalled or withdrawn from the market.
TITLE IX
CODES OF CONDUCT
Article 69
Codes of conduct
1. The Commission and the Member States shall encourage and facilitate the drawing up of codes of conduct intended to foster the voluntary application to AI systems other than high-risk AI systems of the requirements set out in Title III, Chapter 2 on the basis of technical specifications and solutions that are appropriate means of ensuring compliance with such requirements in light of the intended purpose of the systems.

Comment: Where a provider chooses to create a code of conduct to voluntarily apply the mandatory requirements for high-risk AI systems, will they be able to select certain requirements voluntarily, or have to apply all requirements if they choose to create a code of conduct?
2.
The Commission and the Board shall
encourage and facilitate the drawing up of codes
of conduct intended to foster the voluntary
application to AI systems of requirements
related for example to environmental
sustainability, accessibility for persons with a
disability, stakeholders participation in the
design and development of the AI systems and
diversity of development teams on the basis of
clear objectives and key performance indicators
to measure the achievement of those objectives.
3.
Codes of conduct may be drawn up by
individual providers of AI systems or by
organisations representing them or by both,
including with the involvement of users and any
interested stakeholders and their representative
organisations. Codes of conduct may cover one
or more AI systems taking into account the
similarity of the intended purpose of the
relevant systems.
4. The Commission and the Board shall take into account the specific interests and needs of SME providers, including start-ups, when encouraging and facilitating the drawing up of codes of conduct. (Marked change: "the small-scale providers and start-ups" amended to "SME providers, including start-ups".)
TITLE X
CONFIDENTIALITY AND
PENALTIES
Article 70
Confidentiality
1.
National competent authorities and
notified bodies involved in the application of
this Regulation shall respect the confidentiality
of information and data obtained in carrying out
their tasks and activities in such a manner as to
protect, in particular:
(a) intellectual property rights, and
confidential business information or trade
secrets of a natural or legal person, including
source code, except when the cases referred to in
Article 5 of Directive 2016/943 on the
protection of undisclosed know-how and
business information (trade secrets) against their
unlawful acquisition, use and disclosure apply;
(b) the effective implementation of this
Regulation, in particular for the purpose of
inspections, investigations or audits;
(c) public and national security interests;
(d) integrity of criminal or administrative
proceedings.
2.
Without prejudice to paragraph 1,
information exchanged on a confidential basis
between the national competent authorities and
between national competent authorities and the
Commission shall not be disclosed without the
prior consultation of the originating national
competent authority and the user when high-risk
AI systems referred to in points 1, 6 and 7 of
Annex III are used by law enforcement,
immigration or asylum authorities, when such
disclosure would jeopardise public and national
security interests.
When the law enforcement, immigration or
asylum authorities are providers of high-risk AI
systems referred to in points 1, 6 and 7 of
Annex III, the technical documentation referred
to in Annex IV shall remain within the premises
of those authorities. Those authorities shall
ensure that the market surveillance authorities
referred to in Article 63(5) and (6), as
applicable, can, upon request, immediately
access the documentation or obtain a copy
thereof. Only staff of the market surveillance
authority holding the appropriate level of
security clearance shall be allowed to access
that documentation or any copy thereof.
3.
Paragraphs 1 and 2 shall not affect the
rights and obligations of the Commission,
Member States and notified bodies with regard
to the exchange of information and the
dissemination of warnings, nor the obligations
of the parties concerned to provide information
under criminal law of the Member States.
4.
The Commission and Member States may
exchange, where necessary, confidential
information with regulatory authorities of third
countries with which they have concluded
bilateral or multilateral confidentiality
arrangements guaranteeing an adequate level of
confidentiality.
Article 71
Penalties
1. In compliance with the terms and conditions laid down in this Regulation, Member States shall lay down the rules on penalties, including administrative fines, applicable to infringements of this Regulation and shall take all measures necessary to ensure that they are properly and effectively implemented. The penalties provided for shall be effective, proportionate, and dissuasive. They shall take into particular account the interests of SME providers, including start-ups, and their economic viability. (Marked change: "small-scale providers and start-ups" amended to "SME providers, including start-ups".)
2.
The Member States shall notify the
Commission of those rules and of those
measures and shall notify it, without delay, of
any subsequent amendment affecting them.
3. The following infringements shall be subject to administrative fines of up to 30 000 000 EUR or, if the offender is a company, up to 6 % of its total worldwide annual turnover for the preceding financial year, whichever is higher:

Comment: The fines proposed are at a higher level and therefore would not be in line with current penalties for breaches of market surveillance regulations. We note the difference and are consulting with stakeholders.
(a) non-compliance with the prohibition of
the artificial intelligence practices referred to in
Article 5;
(b) non-compliance of the AI system with the
requirements laid down in Article 10.
4.
The non-compliance of the AI system
with any requirements or obligations under this
Regulation, other than those laid down in
Articles 5 and 10, shall be subject to
administrative fines of up to 20 000 000 EUR
or, if the offender is a company, up to 4 % of its
total worldwide annual turnover for the
preceding financial year, whichever is higher.
5.
The supply of incorrect, incomplete or
misleading information to notified bodies and
national competent authorities in reply to a
request shall be subject to administrative fines
of up to 10 000 000 EUR or, if the offender is a
company, up to 2 % of its total worldwide
annual turnover for the preceding financial year,
whichever is higher.
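The three fine tiers in paragraphs 3 to 5 all follow the same "whichever is higher" pattern for companies. The arithmetic can be sketched as follows; the tier labels are this sketch's own, not terms from the text.

```python
# Ceilings from draft Article 71(3)-(5); tier keys are illustrative labels.
CAP_EUR = {
    "art5_or_art10": 30_000_000,     # para 3: Art. 5 practices / Art. 10
    "other_obligation": 20_000_000,  # para 4: any other requirement
    "misleading_info": 10_000_000,   # para 5: incorrect/misleading information
}
TURNOVER_SHARE = {
    "art5_or_art10": 0.06,
    "other_obligation": 0.04,
    "misleading_info": 0.02,
}

def max_fine_eur(tier: str, worldwide_annual_turnover_eur: float) -> float:
    """Upper bound for a company: the higher of the fixed cap and the
    percentage of total worldwide annual turnover ("whichever is higher")."""
    return max(CAP_EUR[tier],
               TURNOVER_SHARE[tier] * worldwide_annual_turnover_eur)

# For EUR 1 bn turnover, 6 % (EUR 60 m) exceeds the EUR 30 m cap:
print(max_fine_eur("art5_or_art10", 1_000_000_000))  # → 60000000.0
```

These are statutory maxima only; paragraph 6 lists the circumstances that determine the actual amount in each case.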
6.
When deciding on the amount of the
administrative fine in each individual case, all
relevant circumstances of the specific situation
shall be taken into account and due regard shall
be given to the following:
(a) the nature, gravity and duration of the
infringement and of its consequences;
(b) whether administrative fines have been
already applied by other market surveillance
authorities to the same operator for the same
infringement.
(c) the size and market share of the operator
committing the infringement;
7.
Each Member State shall lay down rules
on whether and to what extent administrative
fines may be imposed on public authorities and
bodies established in that Member State.
8.
Depending on the legal system of the
Member States, the rules on administrative fines
may be applied in such a manner that the fines
are imposed by competent national courts or
other bodies as applicable in those Member
States. The application of such rules in those
Member States shall have an equivalent effect.
Article 72
Administrative fines on Union institutions,
agencies and bodies
1.
The European Data Protection Supervisor
may impose administrative fines on Union
institutions, agencies and bodies falling within
the scope of this Regulation. When deciding
whether to impose an administrative fine and
deciding on the amount of the administrative
fine in each individual case, all relevant
circumstances of the specific situation shall be
taken into account and due regard shall be given
to the following:
(a) the nature, gravity and duration of the
infringement and of its consequences;
(b) the cooperation with the European Data
Protection Supervisor in order to remedy the
infringement and mitigate the possible adverse
effects of the infringement, including
compliance with any of the measures previously
ordered by the European Data Protection
Supervisor against the Union institution or
agency or body concerned with regard to the
same subject matter;
(c) any similar previous infringements by the
Union institution, agency or body;
2.
The following infringements shall be
subject to administrative fines of up to 500 000
EUR:
(a) non-compliance with the prohibition of
the artificial intelligence practices referred to in
Article 5;
(b) non-compliance of the AI system with the
requirements laid down in Article 10.
3.
The non-compliance of the AI system
with any requirements or obligations under this
Regulation, other than those laid down in
Articles 5 and 10, shall be subject to
administrative fines of up to 250 000 EUR.
4.
Before taking decisions pursuant to this
Article, the European Data Protection
Supervisor shall give the Union institution,
agency or body which is the subject of the
proceedings conducted by the European Data
Protection Supervisor the opportunity of being
heard on the matter regarding the possible
infringement. The European Data Protection
Supervisor shall base his or her decisions only
on elements and circumstances on which the
parties concerned have been able to comment.
Complainants, if any, shall be associated closely
with the proceedings.
5.
The rights of defence of the parties
concerned shall be fully respected in the
proceedings. They shall be entitled to have
access to the European Data Protection
Supervisor’s file, subject to the legitimate
interest of individuals or undertakings in the
protection of their personal data or business
secrets.
6.
Funds collected by imposition of fines in
this Article shall be the income of the general
budget of the Union.
TITLE XI
DELEGATION OF POWER AND
COMMITTEE PROCEDURE
Article 73
Exercise of the delegation
1.
The power to adopt delegated acts is
conferred on the Commission subject to the
conditions laid down in this Article.
2. The delegation of power referred to in Article 4, Article 7(1), Article 11(3), Article 43(5) and (6) and Article 48(5) shall be conferred on the Commission for a period of five years from [entering into force of the Regulation]. The Commission shall draw up a report in respect of the delegation of power not later than nine months before the end of the five-year period. The delegation of power shall be tacitly extended for periods of an identical duration, unless the European Parliament or the Council opposes such extension not later than three months before the end of each period. (Marked change: "an indeterminate period of time" amended to the five-year period, with the reporting and tacit-extension sentences added.)
3.
The delegation of power referred to in
Article 4, Article 7(1), Article 11(3), Article
43(5) and (6) and Article 48(5) may be revoked
at any time by the European Parliament or by
the Council. A decision of revocation shall put
an end to the delegation of power specified in
that decision. It shall take effect the day
following that of its publication in the
Official Journal of the European Union or at a later date
specified therein. It shall not affect the validity
of any delegated acts already in force.
4.
As soon as it adopts a delegated act, the
Commission shall notify it simultaneously to the
European Parliament and to the Council.
5.
Any delegated act adopted pursuant to
Article 4, Article 7(1), Article 11(3), Article
43(5) and (6) and Article 48(5) shall enter into
force only if no objection has been expressed by
either the European Parliament or the Council
within a period of three months of notification
of that act to the European Parliament and the
Council or if, before the expiry of that period,
the European Parliament and the Council have
both informed the Commission that they will
not object. That period shall be extended by
three months at the initiative of the European
Parliament or of the Council.
Article 74
Committee procedure
1.
The Commission shall be assisted by a
committee. That committee shall be a
committee within the meaning of Regulation
(EU) No 182/2011.
2.
Where reference is made to this
paragraph, Article 5 of Regulation (EU) No
182/2011 shall apply.
TITLE XII
FINAL PROVISIONS
Article 75
Amendment to Regulation (EC) No 300/2008
In Article 4(3) of Regulation (EC) No 300/2008,
the following subparagraph is added:
“When adopting detailed measures related to
technical specifications and procedures for
approval and use of security equipment
concerning Artificial Intelligence systems in the
meaning of Regulation (EU) YYY/XX [on
Artificial Intelligence] of the European
Parliament and of the Council*, the
requirements set out in Chapter 2, Title III of
that Regulation shall be taken into account.”
__________
* Regulation (EU) YYY/XX [on Artificial
Intelligence] (OJ …).”
Article 76
Amendment to Regulation (EU) No 167/2013
In Article 17(5) of Regulation (EU) No
167/2013, the following subparagraph is added:
“When adopting delegated acts pursuant to the
first subparagraph concerning artificial
intelligence systems which are safety
components in the meaning of Regulation (EU)
YYY/XX [on Artificial Intelligence] of the
European Parliament and of the Council*, the
requirements set out in Title III, Chapter 2 of
that Regulation shall be taken into account.
__________
* Regulation (EU) YYY/XX [on Artificial
Intelligence] (OJ …).”
Article 77
Amendment to Regulation (EU) No 168/2013
In Article 22(5) of Regulation (EU) No
168/2013, the following subparagraph is added:
“When adopting delegated acts pursuant to the
first subparagraph concerning Artificial
Intelligence systems which are safety
components in the meaning of Regulation (EU)
YYY/XX on [Artificial Intelligence] of the
European Parliament and of the Council*, the
requirements set out in Title III, Chapter 2 of
that Regulation shall be taken into account.
__________
* Regulation (EU) YYY/XX [on Artificial
Intelligence] (OJ …).”
Article 78
Amendment to Directive 2014/90/EU
In Article 8 of Directive 2014/90/EU, the
following paragraph is added:
“4. For Artificial Intelligence systems
which are
safety components in the meaning of Regulation
(EU) YYY/XX [on Artificial Intelligence] of the
European Parliament and of the Council*, when
carrying out its activities pursuant to paragraph
1 and when adopting technical specifications
and testing standards in accordance with
paragraphs 2 and 3, the Commission shall take
into account the requirements set out in Title III,
Chapter 2 of that Regulation.
__________
* Regulation (EU) YYY/XX [on Artificial
Intelligence] (OJ …).”.
Article 79
Amendment to Directive (EU) 2016/797
In Article 5 of Directive (EU) 2016/797, the
following paragraph is added:
“12. When adopting delegated acts pursuant to
paragraph 1 and implementing acts pursuant to
paragraph 11 concerning Artificial Intelligence
systems
which are safety components in the
meaning of Regulation (EU) YYY/XX [on
Artificial Intelligence] of the European
Parliament and of the Council*, the
requirements set out in Title III, Chapter 2 of
that Regulation shall be taken into account.
__________
* Regulation (EU) YYY/XX [on Artificial
Intelligence] (OJ …).”.
Article 80
Amendment to Regulation (EU) 2018/858
In Article 5 of Regulation (EU) 2018/858 the
following paragraph is added:
“4. When adopting delegated acts pursuant to
paragraph 3 concerning Artificial Intelligence
systems which are safety components in the
meaning of Regulation (EU) YYY/XX [on
Artificial Intelligence] of the European
Parliament and of the Council *, the
requirements set out in Title III, Chapter 2 of
that Regulation shall be taken into account.
__________
* Regulation (EU) YYY/XX [on Artificial
Intelligence] (OJ …).”.
Article 81
Amendment to Regulation (EU) 2018/1139
Regulation (EU) 2018/1139 is amended as
follows:
(1) In Article 17, the following paragraph is
added:
“3. Without prejudice to paragraph 2, when
adopting implementing acts pursuant to
paragraph 1 concerning Artificial Intelligence
systems which are safety components in the
meaning of Regulation (EU) YYY/XX [on
Artificial Intelligence] of the European
Parliament and of the Council*, the
requirements set out in Title III, Chapter 2 of
that Regulation shall be taken into account.
__________
* Regulation (EU) YYY/XX [on Artificial
Intelligence] (OJ …).”
(2) In Article 19, the following paragraph is
added:
“4. When adopting delegated acts pursuant to
paragraphs 1 and 2 concerning Artificial
Intelligence systems which are safety
components in the meaning of Regulation (EU)
YYY/XX [on Artificial Intelligence], the
requirements set out in Title III, Chapter 2 of
that Regulation shall be taken into account.”
(3) In Article 43, the following paragraph is
added:
“4. When adopting implementing acts pursuant
to paragraph 1 concerning Artificial Intelligence
systems which are safety components in the
meaning of Regulation (EU) YYY/XX [on
Artificial Intelligence], the requirements set out
in Title III, Chapter 2 of that Regulation shall be
taken into account.”
(4) In Article 47, the following paragraph is
added:
“3. When adopting delegated acts pursuant to
paragraphs 1 and 2 concerning Artificial
Intelligence systems which are safety
components in the meaning of Regulation (EU)
YYY/XX [on Artificial Intelligence], the
requirements set out in Title III, Chapter 2 of
that Regulation shall be taken into account.”
(5) In Article 57, the following paragraph is
added:
“When adopting those implementing acts
concerning Artificial Intelligence systems which
are safety components in the meaning of
Regulation (EU) YYY/XX [on Artificial
Intelligence], the requirements set out in Title
III, Chapter 2 of that Regulation shall be taken
into account.”
(6) In Article 58, the following paragraph is
added:
“3. When adopting delegated acts pursuant to
paragraphs 1 and 2 concerning Artificial
Intelligence systems which are safety
components in the meaning of Regulation (EU)
YYY/XX [on Artificial Intelligence], the
requirements set out in Title III, Chapter 2 of
that Regulation shall be taken into account.”.
Article 82
Amendment to Regulation (EU) 2019/2144
In Article 11 of Regulation (EU) 2019/2144, the
following paragraph is added:
“3. When adopting the implementing acts
pursuant to paragraph 2, concerning artificial
intelligence systems which are safety
components in the meaning of Regulation (EU)
YYY/XX [on Artificial Intelligence] of the
European Parliament and of the Council*, the
requirements set out in Title III, Chapter 2 of
that Regulation shall be taken into account.
__________
* Regulation (EU) YYY/XX [on Artificial
Intelligence] (OJ …).”.
Article 83
AI systems already placed on the market or put
into service
1.
This Regulation shall not apply to the AI
systems which are components of the large-
scale IT systems established by the legal acts
listed in Annex IX that have been placed on the
market or put into service before
[12 months
after the date of application of this Regulation
referred to in Article 85(2)], unless the
replacement or amendment of those legal acts
leads to a significant change in the design or
intended purpose of the AI system or AI
systems concerned.
The requirements laid down in this Regulation
shall be taken into account, where applicable, in
the evaluation of each large-scale IT system
established by the legal acts listed in Annex IX
to be undertaken as provided for in those
respective acts.
2.
This Regulation shall apply to the high-
risk AI systems, other than the ones referred to
in paragraph 1, that have been placed on the
market or put into service before [
date of
application of this Regulation referred to in
Article 85(2)], only if, from that date, those
systems are subject to significant changes in
their design or intended purpose.
Article 84
Evaluation and review
1.
The Commission shall assess the need for
amendment of the list in Annex III once a year
following the entry into force of this Regulation.
1a. The Commission shall assess the need
for amendment of the list in Annex I every 24
months following the entry into force of this
Regulation and until the end of the period of
the delegation of power. The findings of that
assessment shall be presented to the
European Parliament and the Council.
1b. The Commission shall assess the need
for amendment of the list in Annex III every
24 months following the entry into force of
this Regulation and until the end of the
period of the delegation of power. The
findings of that assessment shall be presented
to the European Parliament and the Council.
2. By [three years after the date of application of this Regulation referred to in Article 85(2)] and every four years thereafter, the Commission shall submit a report on the evaluation and review of this Regulation to the European Parliament and to the Council. The reports shall be made public.

Comment: IE considers that a period of 4 years between the submission of the Commission's reports on the evaluation and review of the Regulation is quite long and would propose that this be either reduced or that an interim review be conducted to capture any weaknesses in the system at as early a stage as possible.
3.
The reports referred to in paragraph 2
shall devote specific attention to the following:
(a) the status of the financial and human
resources of the national competent authorities
in order to effectively perform the tasks
assigned to them under this Regulation;
(b) the state of penalties, and notably
administrative fines as referred to in Article
71(1), applied by Member States to
infringements of the provisions of this
Regulation.
4. Within [three years after the date of application of this Regulation referred to in Article 85(2)] and every four years thereafter,
the Commission shall evaluate the impact and
effectiveness of codes of conduct to foster the
application of the requirements set out in Title
III, Chapter 2 and possibly other additional
requirements for AI systems other than high-risk
AI systems.
5.
For the purpose of paragraphs 1 to 3 the
Board, the Member States and national
competent authorities shall provide the
Commission with information on its request.
6.
In carrying out the evaluations and
reviews referred to in paragraphs 1 to 3 the
Commission shall take into account the
positions and findings of the Board, of the
European Parliament, of the Council, and of
other relevant bodies or sources.
7.
The Commission shall, if necessary,
submit appropriate proposals to amend this
Regulation, in particular taking into account
developments in technology and in the light of
the state of progress in the information society.
Article 85
Entry into force and application
1.
This Regulation shall enter into force on
the twentieth day following that of its
publication in the
Official Journal of the
European Union.
2.
This Regulation shall apply from [24
months following the entering into force of the
Regulation].
3.	By way of derogation from paragraph 2:
(a) Title III, Chapter 4 and Title VI shall apply from [three months following the entry into force of this Regulation];
(b) Article 71 shall apply from [twelve months following the entry into force of this Regulation].

Comment: Why do the provisions on notified bodies and governance structures apply from three months but the provisions for penalties apply from 12 months? We would propose that 12 months for the development of both structures would be more coherent.
This Regulation shall be binding in its entirety
and directly applicable in all Member States.
Done at Brussels,
For the European Parliament	For the Council
The President	The President
ANNEX IV
TECHNICAL DOCUMENTATION referred
to in Article 11(1)
The technical documentation referred to in
Article 11(1) shall contain at least the following
information, as applicable to the relevant AI
system:
1.
A general description of the AI system
including:
(a) its intended purpose, the person/s developing the system, the date and the version of the system;
(b) how the AI system interacts or can be
used to interact with hardware or software that
is not part of the AI system itself, where
applicable;
(c) the versions of relevant software or
firmware and any requirement related to version
update;
(d) the description of all forms in which the
AI system is placed on the market or put into
service;
(e) the description of hardware on which the
AI system is intended to run;
(f)
where the AI system is a component of
products, photographs or illustrations showing
external features, marking and internal layout of
those products;
(g) instructions of use for the user and, where applicable, installation instructions;
2.
A detailed description of the elements of
the AI system and of the process for its
development, including:
(a) the methods and steps performed for the
development of the AI system, including, where
relevant, recourse to pre-trained systems or tools
provided by third parties and how these have
been used, integrated or modified by the
provider;
(b) the design specifications of the system,
namely the general logic of the AI system and
of the algorithms; the key design choices
including the rationale and assumptions made,
also with regard to persons or groups of persons
on which the system is intended to be used; the
main classification choices; what the system is
designed to optimise for and the relevance of the
different parameters; the decisions about any
possible trade-off made regarding the technical
solutions adopted to comply with the
requirements set out in Title III, Chapter 2;
(c) the description of the system architecture
explaining how software components build on
or feed into each other and integrate into the
overall processing; the computational resources
used to develop, train, test and validate the AI
system;
(d) where relevant, the data requirements in
terms of datasheets describing the training
methodologies and techniques and the training
data sets used, including information about the
provenance of those data sets, their scope and
main characteristics; how the data was obtained
and selected; labelling procedures (e.g. for
supervised learning), data cleaning
methodologies (e.g. outliers detection);
(e) assessment of the human oversight
measures needed in accordance with Article 14,
including an assessment of the technical
measures needed to facilitate the interpretation
of the outputs of AI systems by the users, in accordance with Article 13(3)(d);
(f)
where applicable, a detailed description of
pre-determined changes to the AI system and
its performance, together with all the relevant
information related to the technical solutions
adopted to ensure continuous compliance of the
AI system with the relevant requirements set out
in Title III, Chapter 2;
(g) the validation and testing procedures used,
including information about the validation and
testing data used and their main characteristics;
metrics used to measure accuracy, robustness,
cybersecurity and compliance with other
relevant requirements set out in Title III,
Chapter 2 as well as potentially discriminatory
impacts; test logs and all test reports dated and
signed by the responsible persons, including
with regard to pre-determined changes as
referred to under point (f).
3.
Detailed information about the
monitoring, functioning and control of the AI
system, in particular with regard to: its
capabilities and limitations in performance,
including the degrees of accuracy for specific
persons or groups of persons on which the
system is intended to be used and the overall
expected level of accuracy in relation to its
intended purpose; the foreseeable unintended
outcomes and sources of risks to health and
safety, fundamental rights and discrimination in
view of the intended purpose of the AI system;
the human oversight measures needed in
accordance with Article 14, including the
technical measures put in place to facilitate the
interpretation of the outputs of AI systems by
the users; specifications on input data, as
appropriate;
4.
A detailed description of the risk
management system in accordance with Article
9;
5.
A description of any change made to the
system through its lifecycle;
6.
A list of the harmonised standards applied
in full or in part the references of which have
been published in the Official Journal of the
European Union; where no such harmonised
standards have been applied, a detailed
description of the solutions adopted to meet the
requirements set out in Title III, Chapter 2,
including a list of other relevant standards and
technical specifications applied;
7.
A copy of the EU declaration of
conformity;
8.
A detailed description of the system in
place to evaluate the AI system performance in
the post-market phase in accordance with
Article 61, including the post-market monitoring
plan referred to in Article 61(3).
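As an illustrative aside only, the eight points above can be read as a completeness checklist for the technical documentation file. The sketch below paraphrases the Annex IV section titles in identifiers of our own choosing; nothing here is prescribed by the Regulation:

```python
# Illustrative only: the titles paraphrase Annex IV points 1-8;
# none of these names are defined by the Regulation itself.
ANNEX_IV_SECTIONS = {
    1: "General description of the AI system",
    2: "Elements of the system and development process",
    3: "Monitoring, functioning and control of the system",
    4: "Risk management system (Article 9)",
    5: "Changes made through the lifecycle",
    6: "Harmonised standards applied, or solutions adopted instead",
    7: "Copy of the EU declaration of conformity",
    8: "Post-market evaluation system (Article 61)",
}

def missing_sections(provided):
    """Return the Annex IV points not yet covered, in numerical order."""
    return [title for num, title in sorted(ANNEX_IV_SECTIONS.items())
            if num not in provided]

# A provider that has drafted points 1-6 still owes points 7 and 8:
print(missing_sections({1, 2, 3, 4, 5, 6}))
```

The point of the sketch is simply that Annex IV is exhaustive ("shall contain at least the following") yet conditional in places ("as applicable to the relevant AI system"), so a real compliance checklist would need applicability flags per point.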
ANNEX V
EU DECLARATION OF CONFORMITY
The EU declaration of conformity referred to in
Article 48, shall contain all of the following
information:
1.
AI system name and type and any
additional unambiguous reference allowing
identification and traceability of the AI system;
2.
Name and address of the provider or,
where applicable, their authorised
representative;
3.
A statement that the EU declaration of
conformity is issued under the sole
responsibility of the provider;
4.
A statement that the AI system in question
is in conformity with this Regulation and, if
applicable, with any other relevant Union
legislation that provides for the issuing of an EU
declaration of conformity;
5.
References to any relevant harmonised
standards used or any other common
specification in relation to which conformity is
declared;
6.
Where applicable, the name and
identification number of the notified body, a
description of the conformity assessment
procedure performed and identification of the
certificate issued;
7.
Place and date of issue of the declaration,
name and function of the person who signed it
as well as an indication for, and on behalf of
whom, that person signed, signature.
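Purely as an illustration, the seven Annex V points map naturally onto a record with one conditional field (point 6 applies only where a notified body was involved). The field names below are our own invention, not terms from the Regulation:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: fields map to Annex V points 1-7,
# but the identifiers are not prescribed by the Regulation.
@dataclass
class EUDeclarationOfConformity:
    system_name_and_reference: str        # point 1
    provider_name_and_address: str        # point 2
    sole_responsibility_statement: str    # point 3
    conformity_statement: str             # point 4
    standards_references: str             # point 5
    notified_body_details: Optional[str]  # point 6, only "where applicable"
    place_date_and_signature: str         # point 7

    def is_complete(self) -> bool:
        """Point 6 is conditional; every other point must be non-empty."""
        required = (
            self.system_name_and_reference,
            self.provider_name_and_address,
            self.sole_responsibility_statement,
            self.conformity_statement,
            self.standards_references,
            self.place_date_and_signature,
        )
        return all(v.strip() for v in required)
```

This mirrors the structure of the annex: six unconditional data points plus one that depends on the conformity assessment route taken under Annexes VI/VII.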
ANNEX VI
CONFORMITY ASSESSMENT
PROCEDURE BASED ON INTERNAL
CONTROL
1.
The conformity assessment procedure
based on internal control is the conformity
assessment procedure based on points 2 to 4.
2.
The provider verifies that the established
quality management system is in compliance
with the requirements of Article 17.
3.
The provider examines the information
contained in the technical documentation in
order to assess the compliance of the AI system
with the relevant essential requirements set out
in Title III, Chapter 2.
4.
The provider also verifies that the design
and development process of the AI system and
its post-market monitoring as referred to in
Article 61 is consistent with the technical
documentation.
ANNEX VII
CONFORMITY BASED ON ASSESSMENT
OF QUALITY MANAGEMENT SYSTEM
AND ASSESSMENT OF TECHNICAL
DOCUMENTATION
1.
Introduction
Conformity based on assessment of quality
management system and assessment of the
technical documentation is the conformity
assessment procedure based on points 2 to 5.
2.
Overview
The approved quality management system for
the design, development and testing of AI
systems pursuant to Article 17 shall be
examined in accordance with point 3 and shall
be subject to surveillance as specified in point 5.
The technical documentation of the AI system
shall be examined in accordance with point 4.
3.
Quality management system
3.1. The application of the provider shall
include:
(a) the name and address of the provider and,
if the application is lodged by the authorised
representative, their name and address as well;
(b) the list of AI systems covered under the
same quality management system;
(c) the technical documentation for each AI
system covered under the same quality
management system;
(d) the documentation concerning the quality
management system which shall cover all the
aspects listed under Article 17;
(e) a description of the procedures in place to
ensure that the quality management system
remains adequate and effective;
(f)
a written declaration that the same
application has not been lodged with any other
notified body.
3.2. The quality management system shall be
assessed by the notified body, which shall
determine whether it satisfies the requirements
referred to in Article 17.
The decision shall be notified to the provider or
its authorised representative.
The notification shall contain the conclusions of
the assessment of the quality management
system and the reasoned assessment decision.
3.3. The quality management system as
approved shall continue to be implemented and
maintained by the provider so that it remains
adequate and efficient.
3.4. Any intended change to the approved
quality management system or the list of AI
systems covered by the latter shall be brought to
the attention of the notified body by the
provider.
The proposed changes shall be examined by the
notified body, which shall decide whether the
modified quality management system continues
to satisfy the requirements referred to in point
3.2 or whether a reassessment is necessary.
The notified body shall notify the provider of its
decision. The notification shall contain the
conclusions of the examination of the changes
and the reasoned assessment decision.
4.
Control of the technical documentation.
4.1. In addition to the application referred to in
point 3, an application with a notified body of
their choice shall be lodged by the provider for
the assessment of the technical documentation
relating to the AI system which the provider
intends to place on the market or put into
service and which is covered by the quality
management system referred to under point 3.
4.2. The application shall include:
(a) the name and address of the provider;
(b) a written declaration that the same
application has not been lodged with any other
notified body;
(c) the technical documentation referred to in
Annex IV.
4.3. The technical documentation shall be
examined by the notified body. To this purpose,
the notified body shall be granted full access to
the training and testing datasets used by the
provider, including through application
programming interfaces (API) or other
appropriate means and tools enabling remote
access.
4.4. In examining the technical documentation,
the notified body may require that the provider
supplies further evidence or carries out further
tests so as to enable a proper assessment of
conformity of the AI system with the
requirements set out in Title III, Chapter 2.
Whenever the notified body is not satisfied with
the tests carried out by the provider, the notified
body shall directly carry out adequate tests, as
appropriate.
4.5. Where necessary to assess the conformity
of the high-risk AI system with the requirements
set out in Title III, Chapter 2 and upon a
reasoned request, the notified body shall also be
granted access to the source code of the AI
system.
4.6. The decision shall be notified to the
provider or its authorised representative. The
notification shall contain the conclusions of the
assessment of the technical documentation and
the reasoned assessment decision.
Where the AI system is in conformity with the
requirements set out in Title III, Chapter 2, an
EU technical documentation assessment
certificate shall be issued by the notified body.
The certificate shall indicate the name and
address of the provider, the conclusions of the
examination, the conditions (if any) for its
validity and the data necessary for the
identification of the AI system.
The certificate and its annexes shall contain all
relevant information to allow the conformity of
the AI system to be evaluated, and to allow for
control of the AI system while in use, where
applicable.
Where the AI system is not in conformity with
the requirements set out in Title III, Chapter 2,
the notified body shall refuse to issue an EU
technical documentation assessment certificate
and shall inform the applicant accordingly,
giving detailed reasons for its refusal.
Where the AI system does not meet the
requirement relating to the data used to train it,
re-training of the AI system will be needed prior
to the application for a new conformity
assessment. In this case, the reasoned
assessment decision of the notified body
refusing to issue the EU technical
documentation assessment certificate shall
contain specific considerations on the quality of the data used to train the AI system, notably on the reasons for non-compliance.
4.7. Any change to the AI system that could
affect the compliance of the AI system with the
requirements or its intended purpose shall be
approved by the notified body which issued the
EU technical documentation assessment
certificate. The provider shall inform such
notified body of its intention to introduce any of
the above-mentioned changes or if it becomes
otherwise aware of the occurrence of such
changes. The intended changes shall be assessed
by the notified body which shall decide whether
those changes require a new conformity
assessment in accordance with Article 43(4) or
whether they could be addressed by means of a
supplement to the EU technical documentation
assessment certificate. In the latter case, the
notified body shall assess the changes, notify the
provider of its decision and, where the changes
are approved, issue to the provider a supplement
to the EU technical documentation assessment
certificate.
5.
Surveillance of the approved quality
management system.
5.1. The purpose of the surveillance carried
out by the notified body referred to in Point 3 is
to make sure that the provider duly fulfils the
terms and conditions of the approved quality
management system.
5.2. For assessment purposes, the provider
shall allow the notified body to access the
premises where the design, development and testing of the AI systems is taking place. The provider
shall further share with the notified body all
necessary information.
5.3. The notified body shall carry out periodic
audits to make sure that the provider maintains
and applies the quality management system and
shall provide the provider with an audit report.
In the context of those audits, the notified body
may carry out additional tests of the AI systems
for which an EU technical documentation
assessment certificate was issued.
ANNEX VIII
INFORMATION TO BE SUBMITTED
UPON THE REGISTRATION OF HIGH-
RISK AI SYSTEMS IN ACCORDANCE
WITH ARTICLE 51
The following information shall be provided and
thereafter kept up to date with regard to high-
risk AI systems to be registered in accordance
with Article 51.
1.
Name, address and contact details of the
provider;
2.
Where submission of information is
carried out by another person on behalf of the
provider, the name, address and contact details
of that person;
3.
Name, address and contact details of the
authorised representative, where applicable;
4.
AI system trade name and any additional
unambiguous reference allowing identification
and traceability of the AI system;
5.
Description of the intended purpose of the
AI system;
6.
Status of the AI system (on the market, or
in service; no longer placed on the market/in
service, recalled);
7.
Type, number and expiry date of the
certificate issued by the notified body and the
name or identification number of that notified
body, when applicable;
8.
A scanned copy of the certificate referred
to in point 7, when applicable;
9.
Member States in which the AI system is
or has been placed on the market, put into
service or made available in the Union;
10. A copy of the EU declaration of
conformity referred to in Article 48;
11. Electronic instructions for use; this
information shall not be provided for high-risk
AI systems in the areas of law enforcement and
migration, asylum and border control
management referred to in Annex III, points 1, 6
and 7.
12. URL for additional information (optional).
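Point 11 above carries the only conditional logic in the Annex VIII field list: electronic instructions for use are withheld for systems in the areas referred to in Annex III, points 1, 6 and 7. A minimal sketch of that condition (the function name and representation are our own, purely illustrative):

```python
# Illustrative sketch only: per point 11 of Annex VIII, electronic
# instructions for use are not provided for high-risk AI systems in
# the areas referred to in Annex III, points 1, 6 and 7.
EXEMPT_ANNEX_III_POINTS = {1, 6, 7}

def instructions_for_use_required(annex_iii_point: int) -> bool:
    """True if point 11 (electronic instructions for use) must be
    submitted for a system falling under the given Annex III point."""
    return annex_iii_point not in EXEMPT_ANNEX_III_POINTS

print(instructions_for_use_required(4))  # a system outside the exempt areas
print(instructions_for_use_required(6))  # a law enforcement area system
```

All other Annex VIII points (1 to 10, plus the optional URL in point 12) apply uniformly to every registered high-risk AI system.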
ANNEX IX
UNION LEGISLATION ON LARGE-
SCALE IT SYSTEMS IN THE AREA OF
FREEDOM, SECURITY AND JUSTICE
1.
Schengen Information System
(a) Regulation (EU) 2018/1860 of the
European Parliament and of the Council of 28
November 2018 on the use of the Schengen
Information System for the return of illegally
staying third-country nationals (OJ L 312,
7.12.2018, p. 1).
(b) Regulation (EU) 2018/1861 of the
European Parliament and of the Council of 28
November 2018 on the establishment, operation
and use of the Schengen Information System
(SIS) in the field of border checks, and
amending the Convention implementing the
Schengen Agreement, and amending and
repealing Regulation (EC) No 1987/2006 (OJ L
312, 7.12.2018, p. 14)
(c) Regulation (EU) 2018/1862 of the
European Parliament and of the Council of 28
November 2018 on the establishment, operation
and use of the Schengen Information System
(SIS) in the field of police cooperation and
judicial cooperation in criminal matters,
amending and repealing Council Decision
2007/533/JHA, and repealing Regulation (EC)
No 1986/2006 of the European Parliament and
of the Council and Commission Decision
2010/261/EU (OJ L 312, 7.12.2018, p. 56).
2.
Visa Information System
(a) Proposal for a REGULATION OF THE
EUROPEAN PARLIAMENT AND OF THE
COUNCIL amending Regulation (EC) No
767/2008, Regulation (EC) No 810/2009,
Regulation (EU) 2017/2226, Regulation (EU)
2016/399, Regulation XX/2018 [Interoperability
Regulation], and Decision 2004/512/EC and
repealing Council Decision 2008/633/JHA -
COM(2018) 302 final. To be updated once the
Regulation is adopted (April/May 2021) by the
co-legislators.
3.
Eurodac
(a) Amended proposal for a REGULATION
OF THE EUROPEAN PARLIAMENT AND
OF THE COUNCIL on the establishment of
'Eurodac' for the comparison of biometric data
for the effective application of Regulation (EU)
XXX/XXX [Regulation on Asylum and
Migration Management] and of Regulation (EU)
XXX/XXX [Resettlement Regulation], for
identifying an illegally staying third-country
national or stateless person and on requests for
the comparison with Eurodac data by Member
States' law enforcement authorities and Europol
for law enforcement purposes and amending
Regulations (EU) 2018/1240 and (EU)
2019/818 – COM(2020) 614 final.
4.
Entry/Exit System
(a) Regulation (EU) 2017/2226 of the
European Parliament and of the Council of 30
November 2017 establishing an Entry/Exit
System (EES) to register entry and exit data and
refusal of entry data of third-country nationals
crossing the external borders of the Member
States and determining the conditions for access
to the EES for law enforcement purposes, and
amending the Convention implementing the
Schengen Agreement and Regulations (EC) No
767/2008 and (EU) No 1077/2011 (OJ L 327,
9.12.2017, p. 20).
5.
European Travel Information and
Authorisation System
(a) Regulation (EU) 2018/1240 of the
European Parliament and of the Council of 12
September 2018 establishing a European Travel
Information and Authorisation System (ETIAS)
and amending Regulations (EU) No 1077/2011,
(EU) No 515/2014, (EU) 2016/399, (EU)
2016/1624 and (EU) 2017/2226 (OJ L 236,
19.9.2018, p. 1).
(b) Regulation (EU) 2018/1241 of the
European Parliament and of the Council of 12
September 2018 amending Regulation (EU)
2016/794 for the purpose of establishing a
European Travel Information and Authorisation
System (ETIAS) (OJ L 236, 19.9.2018, p. 72).
6.
European Criminal Records Information
System on third-country nationals and stateless
persons
(a) Regulation (EU) 2019/816 of the
European Parliament and of the Council of 17
April 2019 establishing a centralised system for
the identification of Member States holding
conviction information on third-country
nationals and stateless persons (ECRIS-TCN) to
supplement the European Criminal Records
Information System and amending Regulation
(EU) 2018/1726 (OJ L 135, 22.5.2019, p. 1).
7.
Interoperability
(a) Regulation (EU) 2019/817 of the
European Parliament and of the Council of 20
May 2019 on establishing a framework for
interoperability between EU information
systems in the field of borders and visa (OJ L
135, 22.5.2019, p. 27).
(b) Regulation (EU) 2019/818 of the
European Parliament and of the Council of 20
May 2019 on establishing a framework for
interoperability between EU information
systems in the field of police and judicial
cooperation, asylum and migration (OJ L 135,
22.5.2019, p. 85).
End