36 Amendments of Róża THUN UND HOHENSTEIN related to 2021/0106(COD)
Amendment 450 #
Proposal for a regulation
Recital 18
(18) The use of AI systems for ‘real-time’ remote biometric identification of natural persons in publicly accessible spaces for the purpose of law enforcement is considered particularly intrusive in the rights and freedoms of the concerned persons, to the extent that it may affect the private life of a large part of the population, evoke a feeling of constant surveillance and indirectly dissuade the exercise of the freedom of assembly and other fundamental rights. In addition, the immediacy of the impact and the limited opportunities for further checks or corrections in relation to the use of such systems operating in ‘real-time’ carry heightened risks for the rights and freedoms of the persons that are concerned by law enforcement activities. The use of those systems in publicly accessible places should therefore be prohibited.
Amendment 464 #
Proposal for a regulation
Recital 19
Amendment 477 #
Proposal for a regulation
Recital 20
Amendment 486 #
Proposal for a regulation
Recital 21
Amendment 494 #
Proposal for a regulation
Recital 22
Amendment 497 #
Proposal for a regulation
Recital 23
Amendment 511 #
Proposal for a regulation
Recital 24
(24) Any processing of biometric data and other personal data involved in the use of AI systems for biometric identification, other than in connection to the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement as regulated by this Regulation, including where those systems are used by competent authorities in publicly accessible spaces for other purposes than law enforcement, should continue to comply with all requirements resulting from Article 9(1) of Regulation (EU) 2016/679, Article 10(1) of Regulation (EU) 2018/1725 and Article 10 of Directive (EU) 2016/680, as applicable.
Amendment 663 #
Proposal for a regulation
Recital 58
(58) Given the nature of AI systems and the risks to safety and fundamental rights possibly associated with their use, including as regards the need to ensure proper monitoring of the performance of an AI system in a real-life setting, it is appropriate to set specific responsibilities for users. Users should in particular use high-risk AI systems in accordance with the instructions of use and certain other obligations should be provided for with regard to monitoring of the functioning of the AI systems and with regard to record-keeping, as appropriate. Given the potential impact and the need for democratic oversight and scrutiny, users of high-risk AI systems that are public authorities or Union institutions, bodies, offices and agencies should be required to conduct a fundamental rights impact assessment prior to commencing the use of a high-risk AI system and should be required to register the use of any high-risk AI systems in a public database.
Amendment 1037 #
Proposal for a regulation
Article 3 – paragraph 1 – point 34
(34) ‘emotion recognition system’ means an AI system for the purpose of identifying or inferring emotions, thoughts or intentions of natural persons on the basis of their biometric or biometrics-based data;
Amendment 1044 #
Proposal for a regulation
Article 3 – paragraph 1 – point 35
(35) ‘biometric categorisation system’ means an AI system for the purpose of assigning natural persons to specific categories, such as sex, age, hair colour, eye colour, tattoos, ethnic origin or sexual or political orientation, or inferring their characteristics and attributes on the basis of their biometric or biometrics-based data;
Amendment 1046 #
Proposal for a regulation
Article 3 – paragraph 1 – point 35
(35) ‘biometric categorisation system’ means an AI system that uses biometric data, or other physical, physiological or behavioral data, capable of assigning natural persons to specific categories, such as sex, age, hair colour, eye colour, tattoos, ethnic origin or sexual or political orientation, or inferring their characteristics and attributes;
Amendment 1061 #
Proposal for a regulation
Article 3 – paragraph 1 – point 36
(36) ‘remote biometric identification system’ means an AI system capable of categorizing natural persons at a distance through the comparison of a person’s biometric data, or other physical, physiological or behavioral data, with this data contained in a reference database;
Amendment 1233 #
Proposal for a regulation
Article 5 – paragraph 1 – point d – introductory part
(d) the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement, unless and in as far as such use is strictly necessary for one of the following objectives:
Amendment 1234 #
Proposal for a regulation
Article 5 – paragraph 1 – point d – introductory part
(d) the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement.
Amendment 1235 #
Proposal for a regulation
Article 5 – paragraph 1 – point d – introductory part
(d) The placing on the market, putting into service or use of AI for an automated recognition of human features in publicly accessible spaces - such as of faces but also of gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioral signals - for any purpose, including law enforcement.
Amendment 1253 #
Proposal for a regulation
Article 5 – paragraph 1 – point d – point i
Amendment 1263 #
Proposal for a regulation
Article 5 – paragraph 1 – point d – point ii
Amendment 1287 #
Proposal for a regulation
Article 5 – paragraph 1 – point d a (new)
(d a) The creation or expansion of facial recognition or other biometric databases through the untargeted scraping of biometric data from social media profiles or CCTV footage or equivalent methods;
Amendment 1296 #
Proposal for a regulation
Article 5 – paragraph 1 – point d b (new)
(d b) The use of private facial recognition or other private biometric databases for the purpose of law enforcement;
Amendment 1299 #
Proposal for a regulation
Article 5 – paragraph 1 – point d c (new)
(d c) The placing on the market, putting into service or use of 'emotion recognition systems', unless for health purposes, which would be considered high risk. Emotion recognition systems for health purposes shall be limited to their intended purpose, subject to all applicable data protection conditions and limits, and:
(i) undergo strict testing to ensure scientific and clinical validity;
(ii) contain clear advice to anyone that may procure or use them about the limitations of such technologies and their potential risks, including of flawed or potentially harmful outcomes;
(iii) be developed with the active participation and input of the groups they are intended to benefit, as well as those with expertise in the range of fundamental rights that could be deliberately or inadvertently impacted;
(iv) be developed and deployed in a manner that respects the rights of all persons likely to be affected by them;
(v) be subject to an opinion of the Health Security Committee and the Fundamental Rights Agency.
Amendment 1311 #
Proposal for a regulation
Article 5 – paragraph 1 – point d d (new)
(d d) AI systems intended to be used by law enforcement authorities as polygraphs and similar tools or to detect the emotional state of a natural person;
Amendment 1317 #
Proposal for a regulation
Article 5 – paragraph 1 – point d f (new)
(d f) the placing on the market, putting into service or use of AI systems that use physiological, behavioural or biometric data to infer attributes or characteristics of persons or groups which are not solely determined by such data or are not externally observable or whose complexity is not possible to fully capture in data, including but not limited to gender, race, colour, ethnic or social origin, as well as political or sexual orientation, or other grounds for discrimination prohibited under Article 21 of the Charter.
Amendment 1325 #
Proposal for a regulation
Article 5 – paragraph 1 – point d g (new)
(d g) AI systems intended to be used by public authorities or on behalf of public authorities to evaluate the eligibility of natural persons for public assistance benefits and services, as well as to grant, reduce, revoke, or reclaim such benefits and services;
Amendment 1354 #
Proposal for a regulation
Article 5 – paragraph 2
Amendment 1356 #
Proposal for a regulation
Article 5 – paragraph 2 – point a
Amendment 1358 #
Proposal for a regulation
Article 5 – paragraph 2 – point b
Amendment 1361 #
Proposal for a regulation
Article 5 – paragraph 2 – subparagraph 1
Amendment 1367 #
Proposal for a regulation
Article 5 – paragraph 3
Amendment 1375 #
Proposal for a regulation
Article 5 – paragraph 3 – subparagraph 1
Amendment 1387 #
Proposal for a regulation
Article 5 – paragraph 4
Amendment 1420 #
Proposal for a regulation
Article 6 – paragraph 1 – point a
(a) the AI system is intended to be used, or reasonably foreseeably used, as a safety component of a product, or is itself a product, covered by the Union harmonisation legislation listed in Annex II;
Amendment 2062 #
Proposal for a regulation
Article 29 – paragraph 5 a (new)
5 a. Users of high-risk AI systems that are public authorities or Union institutions, bodies, offices and agencies shall conduct a fundamental rights impact assessment prior to commencing the use of a high-risk AI system;
Amendment 2950 #
Proposal for a regulation
Article 83 – paragraph 1 – subparagraph 1
Amendment 2961 #
Proposal for a regulation
Article 83 – paragraph 2
2. This Regulation shall apply to the high-risk AI systems, other than the ones referred to in paragraph 1, that have been placed on the market or put into service before [date of application of this Regulation referred to in Article 85(2)], with a transitional period of two years after the application of this Regulation.
Amendment 3163 #
Proposal for a regulation
Annex III – paragraph 1 – point 6 – point b
(b) AI systems intended to be used by law enforcement authorities or on behalf of law enforcement authorities as polygraphs and similar tools or to detect the emotional state of a natural person;
Amendment 3294 #
Proposal for a regulation
Annex VIII – paragraph 1 a (new)
1a. The following information shall be provided and updated with regard to high-risk AI systems to be registered in accordance with Article 51(2) by users who are or act on behalf of public authorities or Union institutions, bodies, offices or agencies:
1. the name, address and contact details of the user;
2. the name, address and contact details of any person submitting information on behalf of the user;
3. the high-risk AI system trade name and any additional unambiguous reference allowing identification and traceability of the AI system used;
4. a description of the intended use of the AI system, including the specific outcomes sought through the use of the system;
5. a summary of the findings of the fundamental rights impact assessment conducted in accordance with the obligation of public authorities or Union institutions, agencies, offices or bodies set out in this Regulation;
6. a summary of the data protection impact assessment carried out in accordance with Article 35 of Regulation (EU) 2016/679 or Article 27 of Directive (EU) 2016/680 as specified in paragraph 6 of Article 29 of this Regulation, where applicable;
7. a declaration of conformity with the applicable data protection rules.