22 Amendments of Pietro BARTOLO related to 2022/0155(COD)
Amendment 329 #
Proposal for a regulation
Recital 14 a (new)
(14a) Given the severity of these crimes and the long-lasting negative consequences on the victims and the risk of revictimization as a result of the dissemination of known material, new material, as well as activities constituting the solicitation of children, it is essential that this Regulation provides specific obligations for providers of hosting services and providers of interpersonal communication services to prevent, detect, report and remove child sexual abuse material in all their services, including interpersonal communication services, which may also be covered by end-to-end encryption, in light of the prevalence of dissemination of child sexual abuse material, including the solicitation of children, in interpersonal communication services.
Amendment 385 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Nothing in this Regulation should therefore be interpreted as prohibiting end-to-end encryption or making it impossible. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users, while ensuring the effective detection of online child sexual abuse and the balance of all the fundamental rights at stake.
Amendment 401 #
Proposal for a regulation
Recital 27 a (new)
(27a) To the extent strictly necessary and proportionate to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, it should be possible for the Coordinating Authority of establishment to authorise providers to process metadata.
Amendment 574 #
Proposal for a regulation
Article 2 – paragraph 1 – point i a (new)
(ia) "adult" means any natural person above the age of 18 years;
Amendment 583 #
Proposal for a regulation
Article 2 – paragraph 1 – point j a (new)
(ja) "adult user" means a natural person who uses a relevant information society service and who is above the age of 18 years;
Amendment 694 #
Proposal for a regulation
Article 3 – paragraph 2 a (new)
2a. When providers of hosting services and providers of interpersonal communication services put forward age assurance or age verification systems as a mitigation measure, those systems shall meet the following criteria:
a) protect the privacy of users and not disclose data gathered for the purposes of age assurance for any other purpose;
b) not collect data that is not necessary for the purpose of age assurance;
c) be proportionate to the risks associated with the product or service that presents a risk of misuse for child sexual abuse;
d) provide appropriate remedies and redress mechanisms for users whose age is wrongly identified.
Amendment 742 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) Designing educational and awareness-raising campaigns aimed at informing and alerting users about the risks of online child sexual abuse, including child-appropriate information;
Amendment 769 #
Proposal for a regulation
Article 4 – paragraph 1 – point c a (new)
(ca) processing metadata, in accordance with Article 4a;
Amendment 776 #
Proposal for a regulation
Article 4 – paragraph 1 a (new)
1a. Providers of hosting services and providers of interpersonal communications services shall continue the voluntary use of specific technologies, as mitigation measures, for the processing of personal and other data to the extent strictly necessary to detect, report and remove online child sexual abuse on their services and to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, including for the purpose of the solicitation of children, pursuant to the risk assessment conducted or updated in accordance with Article 3 and prior authorization from the Coordinating Authority;
Amendment 780 #
Proposal for a regulation
Article 4 – paragraph 1 b (new)
1b. The Coordinating Authority shall decide whether to proceed according to paragraph 1a no later than three months from the provider's request.
Amendment 805 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably differentiate between child and adult users on their services, enabling them to take the mitigation measures. Age assurance or age verification systems as mitigation measures shall be implemented only if they meet the criteria set out in Article 3, paragraph 2a of this Regulation.
Amendment 826 #
Proposal for a regulation
Article 4 a (new)
Article 4a
Legal basis for the risk mitigation through metadata processing
1. On the basis of the risk assessment submitted and, where applicable, further information, the Coordinating Authority of establishment shall have the power to authorise or require a provider of hosting services or a provider of interpersonal communications services to process metadata to the extent strictly necessary and proportionate to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, as a mitigation measure in accordance with Article 4. When assessing whether to authorise or require the processing of metadata, the Coordinating Authority shall take into account any interference with the rights to privacy and data protection of the users of the service that such processing entails and determine whether, in that case, the processing of metadata would be effective in mitigating the risk of use of the service for the purpose of child sexual abuse, and whether it is strictly necessary and proportionate.
2. If they process metadata as a risk mitigation measure, providers shall inform their users of such processing in their terms and conditions, including information on the possibility to submit complaints to the competent DPA concerning the relevant processing, in accordance with Regulation (EU) 2016/679, and on the avenues for judicial redress.
Amendment 925 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – introductory part
Where, having regard to the comments of the provider and the opinion of the EU Centre, and in particular taking into account the assessment of the EU Centre's Technical Committee as referred to in Article 66(6)(a NEW), that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall re-submit the draft request, adjusted where appropriate, to the provider. In that case, the provider shall do all of the following, within a reasonable time period set by that Coordinating Authority:
Amendment 928 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point a
(a) draft an implementation plan setting out the measures it envisages taking to execute the intended detection order, including detailed information regarding the envisaged technologies and safeguards; the implementation plan shall explicitly set out the specific measures that the provider intends to take to counteract potential security risks that might be linked to the execution of the detection order on its services. The provider may consult the EU Centre, and in particular its Technology Committee, to obtain support in identifying appropriate measures in this respect;
Amendment 1025 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 2
To that aim, they shall take into account all relevant parameters, including:
(i) the availability of sufficiently reliable detection technologies, in that they can be deployed without undermining the security of the service in question and they limit to the maximum extent possible the rate of errors regarding the detection;
(ii) the suitability and effectiveness of the available technologies for achieving the objectives of this Regulation;
(iii) the impact of the measures on the rights of the users affected, thereby ensuring that detection orders are only requested and issued when sufficiently reliable technologies in accordance with point (i) are available and that the least intrusive measures are chosen, in accordance with Article 10, from among several equally effective measures.
Amendment 1031 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 3 – point a
(a) where the information gathered in the risk assessment process indicates that risk is limited to an identifiable part or component of a service, where possible without prejudice to the effectiveness of the measure, the required measures are only applied in respect of that part or component;
Amendment 1049 #
Proposal for a regulation
Article 7 a (new)
Article 7a
Safeguards on encrypted services
For the scope of this Regulation and for the sole purpose of preventing and combating child sexual abuse, providers of interpersonal communications services shall be subject to obligations to prevent, detect, report and remove online child sexual abuse on all their services, which may include those covered by end-to-end encryption, when there is a significant risk that their specific service is misused for online child sexual abuse, including for the purpose of the solicitation of children, pursuant to the risk assessment established in Article 3 of this Regulation. The technologies deployed to execute the detection order pursuant to Article 7 of this Regulation shall never prohibit encryption or make it impossible and shall only be deployed after a prior authorization by the Coordinating Authority, in consultation with the competent data protection authority, and shall be subject to constant monitoring and auditing by the competent data protection authority to verify their compliance with Union law.
Amendment 1136 #
Proposal for a regulation
Article 10 – paragraph 2
2. The provider shall be entitled to acquire, install and operate, free of charge, technologies made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the detection order and, where needed, of adopting the security measures imposed by Article 7(3)(a). The provider shall not be required to use any specific technology, including those made available by the EU Centre, as long as the requirements set out in this Article are met. The use of the technologies made available by the EU Centre shall not affect the responsibility of the provider to comply with those requirements and for any decisions it may take in connection to or as a result of the use of the technologies.
Amendment 1161 #
(da) not able to prohibit or make end-to-end encryption impossible.
Amendment 1594 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 a (new)
(6a) support Member States in designing preventive measures, such as awareness-raising campaigns to combat child sexual abuse, with a specific focus on girls and other prevalent demographics, including by:
a) acting on behalf of victims in liaising with other relevant authorities of the Member States for reparations and all other victim support programmes;
b) referring victims to the appropriate child protection services, and to pro bono legal support services.
Amendment 1618 #
Proposal for a regulation
Article 44 – paragraph 4 a (new)
4a. The EU Centre shall ensure, through all technical means available, that the database of indicators is secure and cannot be altered by providers, users or any other actor at the moment of its deployment for the purpose of detection.
Amendment 1697 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The EU Centre shall make available:
(i) technologies that providers of hosting services and providers of interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1);
(ii) technologies that providers of end-to-end encrypted electronic communication services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to adopt the security measures imposed on them by Article 7(3)(a).