37 Amendments of Maite PAGAZAURTUNDÚA related to 2022/0155(COD)
Amendment 370 #
Proposal for a regulation
Recital 22
(22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection orders can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority have objectively and diligently assessed, identified and weighed, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the likelihood and seriousness of any potential negative consequences for other parties affected. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.
Amendment 417 #
Proposal for a regulation
Recital 30
(30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection, Coordinating Authorities of establishment should have the power to request competent judicial authorities or independent administrative authorities to issue a removal order addressed to providers of hosting services. As removal or disabling of access may affect the right of users who have provided the material concerned, providers should inform such users of the reasons for the removal, to enable them to exercise their right of redress, subject to exceptions needed to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.
Amendment 487 #
Proposal for a regulation
Recital 74 a (new)
(74a) The Technology Committee could therefore establish a certification for technologies which online service providers could, at their request, use to detect child sexual abuse material.
Amendment 515 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point d a (new)
(da) obligations on providers of online search engines and any other artificial intelligence systems to delist or disable access to specific items of child sexual abuse material, or both;
Amendment 555 #
Proposal for a regulation
Article 2 – paragraph 1 – point e a (new)
(ea) “online search engine” means an intermediary service as defined in Article 3, point (j), of Regulation (EU) 2022/2065;
Amendment 556 #
Proposal for a regulation
Article 2 – paragraph 1 – point e b (new)
(eb) ‘intermediary service’ means a service as defined in Article 3, point (g), of Regulation (EU) 2022/2065;
Amendment 557 #
Proposal for a regulation
Article 2 – paragraph 1 – point e c (new)
(ec) ‘artificial intelligence system’ (AI system) means software as defined in Article 3(1) of Regulation (EU) .../... on Artificial Intelligence (Artificial Intelligence Act);
Amendment 569 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iv a (new)
(iva) an online search engine;
Amendment 570 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iv b (new)
(ivb) an artificial intelligence system.
Amendment 581 #
(j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 18 years;
Amendment 613 #
Proposal for a regulation
Article 3 – paragraph 1
1. Providers of hosting services and providers of interpersonal communications services shall identify, analyse and assess, for each such service that they offer, the risk of use of the service for the purpose of online child sexual abuse, which requires a targeted and tailor-made response.
Amendment 628 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – introductory part
(b) the existence and implementation by the provider of a policy and the availability of functionalities to prevent and address the risk referred to in paragraph 1, including through the following:
Amendment 634 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 2 a (new)
Amendment 650 #
– Functionalities enabling the detection of known child sexual abuse material on upload;
– Functionalities preventing uploads from the dark web;
Amendment 688 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 3 a (new)
– Enabling users to create usernames that contain a representation about, or imply, the user’s age;
– Enabling child users to create usernames that contain location information on child users;
– Enabling users to know or infer the location of child users.
Amendment 693 #
Proposal for a regulation
Article 3 – paragraph 2 a (new)
2a. When providers of hosting services and providers of interpersonal communication services put forward age assurance or age verification systems as mitigating measures, such systems shall meet the following criteria:
(a) protect the privacy of users and not disclose data gathered for the purposes of age assurance for any other purpose;
(b) not collect data that is not necessary for the purposes of age assurance;
(c) be proportionate to the risks associated with the product or service that presents a risk of misuse for child sexual abuse;
(d) provide appropriate remedies and redress mechanisms for users whose age is wrongly identified.
Amendment 734 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall take reasonable mitigation measures, tailored to their specific service and the risk identified pursuant to Article 3, to minimise that risk. Such measures shall include some or all of the following:
Amendment 741 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) Designing educational and awareness-raising campaigns aimed at informing and alerting users about the risks of online child sexual abuse, including child-appropriate information;
Amendment 775 #
1a. Providers of hosting services and providers of interpersonal communications services shall continue the voluntary use of specific technologies, as mitigation measures, for the processing of personal and other data to the extent strictly necessary to detect, report and remove online child sexual abuse on their services and to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, including for the purpose of the solicitation of children, pursuant to the risk assessment conducted or updated in accordance with Article 3 and prior authorisation from the Coordinating Authority.
Amendment 804 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary and proportionate age verification and age assessment measures to reliably differentiate between child users and adult users on their services, enabling them to take the mitigation measures and protect child users. Age assurance or age verification systems as mitigation measures shall be implemented only if they meet the criteria set out in Article 3, paragraph 2a, of this Regulation.
Amendment 838 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) any mitigation measures taken and those that require prior authorisation pursuant to Article 4.
Amendment 892 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a detection order requiring a provider of hosting services or a provider of interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse on a specific service.
Amendment 951 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – introductory part
The Coordinating Authority of establishment shall request the issuance of the detection order, and the competent judicial authority or independent administrative authority shall issue the detection order where it considers that the following conditions are met:
Amendment 1162 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
(da) not able to prohibit end-to-end encryption or render it impossible.
Amendment 1171 #
Proposal for a regulation
Article 10 – paragraph 4 – point a
(a) take all the necessary measures to ensure that the technologies and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, insofar as strictly necessary to use voluntary measures, when authorised, or execute the detection orders addressed to them;
Amendment 1269 #
Proposal for a regulation
Article 14 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a removal order requiring a provider of hosting services under the jurisdiction of the Member State that designated that Coordinating Authority to remove, or disable access in all Member States to, one or more specific items of material that, after a diligent assessment, the Coordinating Authority or the courts or other independent administrative authorities referred to in Article 36(1) identified as constituting child sexual abuse material.
Amendment 1340 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1
Amendment 1363 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
Amendment 1391 #
Proposal for a regulation
Article 23 – paragraph 1
1. As referred to in Article 12 of the Digital Services Act Regulation, providers of relevant information society services shall establish a single point of contact allowing for direct communication, by electronic means, with the Coordinating Authorities, other competent authorities of the Member States, the Commission and the EU Centre, for the application of this Regulation.
Amendment 1541 #
Proposal for a regulation
Article 42 – paragraph 1
The choice of the location of the seat of the EU Centre shall be made in accordance with the ordinary legislative procedure, based on the following criteria:
(a) it shall not affect the EU Centre’s execution of its tasks or the organisation of its governance structure;
(b) it shall ensure that the EU Centre is able to recruit the highly qualified and specialised staff it requires to perform the tasks provided for by this Regulation;
(c) it shall ensure that the EU Centre can be set up on site upon the entry into force of this Regulation;
(d) it shall ensure appropriate accessibility of the location, the existence of adequate education facilities for the children of staff members, and appropriate access to the labour market, social security and medical care for both children and spouses;
(e) it shall enable close cooperation with EU institutions, bodies and agencies;
(f) it shall ensure sustainability and digital security and connectivity with regard to physical and IT infrastructure and working conditions.
Amendment 1580 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b a (new)
(ba) Referring victims to the appropriate national child protection services;
Amendment 1593 #
(6a) support Member States in designing preventive measures, such as awareness-raising campaigns to combat child sexual abuse, with a specific focus on girls and other prevalent demographics, including by:
(a) acting on behalf of victims in liaising with other relevant authorities of the Member States for reparations and all other victim support programmes;
(b) referring victims to the appropriate child protection services, and to pro bono legal support services;
(c) facilitating access to qualified health support services, including mental health and psychological support;
Amendment 1634 #
Proposal for a regulation
Article 46 – paragraph 2
2. The EU Centre shall give providers of hosting services, providers of interpersonal communications services and providers of internet access services access to the databases of indicators referred to in Article 44, where and to the extent necessary for them to put in place voluntary measures, when authorised, and execute the detection or blocking orders that they received in accordance with Articles 7 or 16. It shall take measures to ensure that such access remains limited to what is strictly necessary for the period of application of the detection or blocking orders concerned as well as for the execution of the voluntary measures, when authorised, and that such access does not in any way endanger the proper operation of those databases and the accuracy and security of the data contained therein.
Amendment 1711 #
Proposal for a regulation
Article 50 – paragraph 2 – point c
(c) information resulting from research or other activities conducted by Member States’ authorities, other Union institutions, bodies, offices and agencies, the competent authorities of third countries, international organisations, research centres, hotlines and civil society organisations.
Amendment 1755 #
Proposal for a regulation
Article 54 – paragraph 1
1. Where necessary for the performance of its tasks under this Regulation, the EU Centre may cooperate with organisations and networks with information and expertise on matters related to the prevention and combating of online child sexual abuse, including civil society organisations acting in the public interest, hotlines and semi-public organisations.
Amendment 1761 #
Proposal for a regulation
Article 54 – paragraph 2 a (new)
2a. The EU Centre shall cooperate with other organisations and bodies carrying out similar functions in other jurisdictions, such as the National Center for Missing and Exploited Children (‘NCMEC’) and the Canadian Centre for Child Protection, among others, which serve the same purpose as this Regulation, in order to avoid potential duplication of reporting obligations for providers.
Amendment 1778 #
Proposal for a regulation
Article 57 – paragraph 1 – point f
(f) appoint the members of the Technology Committee, of the Children's Rights and Survivors Advisory Board and of any other advisory group it may establish;