13 Amendments of Paul TANG related to 2020/2016(INI)
Amendment 3 #
Motion for a resolution
Citation 2
— having regard to the Charter of Fundamental Rights of the European Union, in particular Articles 7 and 8
Amendment 36 #
Motion for a resolution
Recital C a (new)
C a. whereas AI systems must always be in the service of humans and, as an ultimate safety valve, be designed so that they can always be shut down by a human operator;
Amendment 52 #
Motion for a resolution
Recital F b (new)
F b. whereas the relationship between protecting fundamental rights and effective policing must always be an essential element in discussions on whether and how AI should be used by the law enforcement sector, where decisions may have long-lasting consequences for the life and freedom of individuals;
Amendment 85 #
Motion for a resolution
Paragraph 1 a (new)
1 a. Recalls that the EU has already established data protection standards for law enforcement, which form the foundation for any future regulation of AI; recalls that the processing of personal data must be lawful and fair; that the purposes of processing must be specified, explicit and legitimate; that data must be adequate, relevant and not excessive in relation to the purpose for which they are processed; that data must be accurate and kept up to date (inaccurate data should, subject to the purpose for which they would otherwise be retained, be corrected or erased); and that data should be kept for no longer than is necessary and processed in a secure manner;
Amendment 116 #
Motion for a resolution
Paragraph 4
4. Highlights the importance of preventing mass surveillance by means of AI technologies, in particular face recognition, and of banning applications that would result in it;
Amendment 143 #
Motion for a resolution
Paragraph 8 a (new)
8 a. Stresses that only robust European AI governance can enable the necessary operationalisation of fundamental rights principles;
Amendment 145 #
Motion for a resolution
Paragraph 9
9. Considers it necessary to create a clear and fair regime for assigning legal responsibility for the potential adverse consequences produced by these advanced digital technologies; recognises the challenge of correctly locating responsibility for potential harm, given the complexity of the development and operation of AI systems;
Amendment 149 #
Motion for a resolution
Paragraph 9 a (new)
9 a. Highlights how individuals have become overly trusting of the seemingly objective and scientific nature of AI tools and thus fail to consider the possibility that their results may be incorrect, incomplete or irrelevant, with potentially grave adverse consequences, specifically in the area of law enforcement and justice; emphasises the risk of over-reliance on the results provided by AI systems, and notes with concern that authorities lack the confidence and knowledge to question or override an algorithmic recommendation;
Amendment 164 #
Motion for a resolution
Paragraph 11 a (new)
11 a. Calls, in order to guarantee the algorithmic explainability and transparency of law enforcement AI systems, for law enforcement authorities in the Union to be allowed to purchase only tools whose algorithms and logic are open, at least to the police forces themselves, so that they can be audited, evaluated and vetted by them, rather than closed and labelled proprietary by the vendors;
Amendment 166 #
Motion for a resolution
Paragraph 11 b (new)
11 b. Considers that the collection and use of any biometric data for remote identification purposes, for example by conducting facial recognition in public places or at automated border control gates used for border checks at airports, may pose specific risks to fundamental rights, the implications of which could vary considerably depending on the purpose, context and scope of use;
Amendment 171 #
Motion for a resolution
Paragraph 13
13. Calls for a compulsory fundamental rights impact assessment to be conducted prior to the implementation or deployment of any AI system for law enforcement or the judiciary, in order to assess any potential risks to fundamental rights; underlines that such an assessment could often build upon the mandatory data protection impact assessments;
Amendment 184 #
Motion for a resolution
Paragraph 15
15. Calls for a permanent prohibition on the use of facial recognition systems in the public space and in premises meant for education and (health) care, and a moratorium on the deployment of facial recognition systems for law enforcement in semi-public spaces, such as airports, until the technical standards can be considered fully fundamental rights compliant, the results derived are non-discriminatory, and there is public trust in the necessity and proportionality of the deployment of such technologies;
Amendment 198 #
Motion for a resolution
Paragraph 16 a (new)
16 a. Calls for the Fundamental Rights Agency, in collaboration with the European Data Protection Board and the European Data Protection Supervisor, to draft comprehensive guidelines for the development, use and deployment of AI applications and solutions for use by law enforcement and judicial authorities;