37 Amendments of Svenja HAHN related to 2022/0155(COD)
Amendment 192 #
Proposal for a regulation
Recital 17 a (new)
(17 a) Fundamental rights in the digital sphere have to be guaranteed to the same extent as in the offline world. Safety and privacy need to be ensured, amongst others through end-to-end encryption in private online communication and the protection of private content against any kind of general or targeted surveillance, be it by public or private actors.
Amendment 274 #
Proposal for a regulation
Article 2 a (new)
Article 2 a
End-to-End Encryption and Prohibition on General Monitoring
1. End-to-end encryption is essential to guarantee the security and confidentiality of the communications of users, including those of children. Any restriction of encryption could lead to abuse by malicious actors. Nothing in this Regulation should be interpreted as prohibiting providers of information society services from providing their services applying end-to-end encryption, or as restricting or undermining such encryption. Member States should not prevent providers of information society services from providing their services applying encryption, considering that such encryption is essential for trust in and the security of digital services, and effectively prevents unauthorised third-party access.
2. Nothing in this Regulation should undermine the prohibition of general monitoring under EU law.
Amendment 280 #
Proposal for a regulation
–
The European Parliament rejects the Commission proposal (COM(2022)0209).
Amendment 287 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
— functionalities enabling protection of children and preventing online child sexual abuse;
Amendment 323 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall take reasonable mitigation measures, tailored to the risk identified pursuant to Article 3, to minimise that risk. Such measures shall include, but need not be limited to, some or all of the following:
Amendment 349 #
Proposal for a regulation
Article 4 – paragraph 3
Amendment 373 #
Proposal for a regulation
Article 5 – paragraph 6 a (new)
6 a. Providers of hosting services and providers of interpersonal communications services that qualify as micro (or small) enterprises within the meaning of Article 3 of Directive 2013/34/EU shall transmit a simplified version of the report under paragraph 1 of this Article.
Amendment 374 #
Proposal for a regulation
Article 5 a (new)
Amendment 375 #
Proposal for a regulation
Article 6
Amendment 390 #
Proposal for a regulation
Chapter II – Section 2
Amendment 392 #
Proposal for a regulation
Article 7
Amendment 425 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b a (new)
(b a) the voluntary measures applied as mitigation measures have not proven successful in preventing the misuse of the service for child sexual abuse.
Amendment 458 #
Amendment 467 #
Proposal for a regulation
Article 8 – paragraph 1 – point c
(c) the name of the provider and, where applicable, its legal representative, without prejudice to the issuance of detection orders where the legal name of the provider is not readily ascertained;
Amendment 485 #
Proposal for a regulation
Article 9
Amendment 496 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse by persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children in the internal market.
Amendment 503 #
Proposal for a regulation
Article 10
Amendment 515 #
Proposal for a regulation
Article 10 – paragraph 4 – point a
(a) take all the necessary and proportionate measures to ensure that the technologies and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, insofar as strictly limited to what is necessary to execute the detection orders addressed to them;
Amendment 530 #
Proposal for a regulation
Article 11
Amendment 535 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1 a. Where a provider of hosting services or a provider of interpersonal communications services receives a report from the public through, among others, a trusted hotline, it shall process and analyse the report in a timely and effective manner so as to assess an imminent risk of misuse of the service for child sexual abuse, without prejudice to the obligation to report to the EU Centre pursuant to paragraph 1.
Amendment 539 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. The report submitted by the provider pursuant to paragraph 2 shall never contain information about the source of the report, especially when it stems from the person to whom the material relates.
Amendment 553 #
Proposal for a regulation
Article 14 – paragraph 1
1. Removal orders shall be issued by judicial authorities in line with Article 9 on orders to act against illegal content of Regulation (EU) 2022/2065.
Amendment 556 #
Proposal for a regulation
Article 14 – paragraph 2
Amendment 594 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) “person suspected of being involved in child sexual abuse” means an identified individual person about whom verifiable, adequate evidence exists which gives rise to the suspicion that that person has committed a child sexual abuse offence, attempted to commit a child sexual abuse offence, or prepared for the commission of a child sexual abuse offence by committing a criminal offence;
Amendment 596 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(qb) “person disqualified from exercising activities involving children” means an identified individual person who, in line with Article 10 of Directive 2011/93/EU, is temporarily or permanently disqualified from exercising activities involving direct and regular contacts with children;
Amendment 670 #
Proposal for a regulation
Article 85 – paragraph 1
1. By [five years after the entry into force of this Regulation], and every five years thereafter, the Commission shall evaluate this Regulation and submit a report on its application to the European Parliament and the Council. This report shall address in particular the possible use of new technologies for a safe and trusted processing of personal and other data and for the purpose of combating online child sexual abuse and in particular to detect, report and remove online child sexual abuse. The report shall be accompanied, where appropriate, by a legislative proposal.
Amendment 807 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take reasonable and proportionate mitigation measures.
Amendment 861 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
(b) inform the software application provider concerned and the EU Centre about the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;
Amendment 868 #
Proposal for a regulation
Article 6 – paragraph 1 – point c
Amendment 870 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
1a. Providers of software applications who have been informed that in relation to their software applications a significant risk of use of the service concerned for the purpose of the solicitation of children has been identified, shall take reasonable and proportionate mitigation measures.
Amendment 890 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detection order requiring a provider of hosting services or a provider of interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse on a specific service in the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children.
Amendment 1128 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of hosting services and providers of interpersonal communication services that have received a detection order concerning the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46.
Amendment 1266 #
Proposal for a regulation
Article 14 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a removal order requiring a provider of hosting services under the jurisdiction of the Member State that designated that Coordinating Authority to remove or disable access in all Member States to one or more specific items of material that, after a diligent assessment, the courts identified as constituting child sexual abuse material.
Amendment 1294 #
Proposal for a regulation
Chapter II – Section 5
Amendment 1332 #
Proposal for a regulation
Article 19 a (new)
Article 19 a
Respect for Privacy
Nothing in this Regulation shall be interpreted as a requirement to:
1. break cryptography;
2. scan content on users’ devices;
3. restrict anonymous access to online services and software applications.
Amendment 1698 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The EU Centre shall make available technologies that providers of hosting services and providers of interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1) concerning the online activities of persons suspected of being involved in child sexual abuse and persons disqualified from exercising activities involving children.
Amendment 1701 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
To that aim, the EU Centre shall compile lists of such technologies, having regard to the requirements of this Regulation and in particular those of Article 10(2) and Article 19a (new).