Activities of Vincenzo SOFO related to 2022/0155(COD)
Shadow reports (1)
REPORT on the proposal for a regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse
Amendments (33)
Amendment 289 #
Proposal for a regulation
Recital 2
(2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and combat such abuse. The measures taken should be targeted, carefully balanced and proportionate, so as to avoid any undue negative consequences for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid imposing any excessive burdens on the providers of the services. Considering the importance of the right to privacy, including the protection of personal data, as guaranteed by the Charter of Fundamental Rights, nothing in this Regulation should be interpreted in a way that would enable future broad-based mass surveillance.
Amendment 323 #
Proposal for a regulation
Recital 13 a (new)
(13a) In order to protect children, this Regulation should take into account the concerning hypersexualized use of children's images in advertising campaigns and the increasing spread of cultural pseudo-pedophilia, also fuelled by fundraising campaigns.
Amendment 335 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification, parental control tools and functionalities enabling self-reporting by children, their parents or legal guardians, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.
Amendment 386 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Nothing in this Regulation should therefore be interpreted as prohibiting end-to-end encryption, making it impossible, or leading to any form of general monitoring. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users. Under no circumstances should this Regulation be interpreted or used as an instrument of mass surveillance and monitoring.
Amendment 404 #
Proposal for a regulation
Recital 28
(28) With a view to constantly assessing the performance of the detection technologies, ensuring that they are sufficiently reliable and that they do not produce too many false positives, identifying the reasons for their appearance, and avoiding to the extent possible erroneous reporting to the EU Centre, providers should ensure stringent human oversight and, where necessary and required to uphold the highest possible standards, human intervention, adapted to the type of detection technologies and the type of online child sexual abuse at issue. Such oversight should include regular and independent assessment of the rates of false negatives and positives generated by the technologies, based on an analysis of anonymised representative data samples. In particular where the detection of the solicitation of children in publicly available interpersonal communications is concerned, service providers should ensure regular, specific and detailed human oversight and human verification of conversations identified by the technologies as involving potential solicitation of children.
Amendment 411 #
Proposal for a regulation
Recital 29
(29) Providers of hosting services and providers of publicly available interpersonal communications services are uniquely positioned to detect potential online child sexual abuse involving their services. The information that they may obtain when offering their services is often indispensable to effectively investigate and prosecute child sexual abuse offences. Therefore, they should be required to report on potential online child sexual abuse on their services, whenever they become aware of it, that is, when there are reasonable grounds to believe that a particular activity may constitute online child sexual abuse. Where such reasonable grounds exist, doubts about the potential victim’s age should not prevent those providers from submitting reports. In the interest of effectiveness, it should be immaterial in which manner they obtain such awareness. The providers can obtain such actual knowledge or awareness, inter alia, through their own initiative investigations, as well as through information flagged or notified by users, self-reported by victims, or by organizations, such as hotlines, acting in the public interest against child sexual abuse. Those providers should report a minimum of information, as specified in this Regulation, for competent law enforcement authorities to be able to assess whether to initiate an investigation, where relevant, and should ensure that the reports are as complete as possible before submitting them so that competent law enforcement authorities can focus on reports that are most likely to lead to the recovery of a child, the arrest of an offender, or both.
Amendment 413 #
Proposal for a regulation
Recital 30
(30) To ensure that online child sexual abuse material is removed as swiftly as possible after its detection, Coordinating Authorities of establishment should have the power to request competent judicial authorities or independent administrative authorities to issue a removal order addressed to providers of hosting services. Any removal or disabling of access should respect the fundamental rights of the users of the service, including the right to freedom of expression and of information. As removal or disabling of access may affect the right of users who have provided the material concerned, providers should inform such users of the reasons for the removal, to enable them to exercise their right of redress, subject to exceptions needed to avoid interfering with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.
Amendment 497 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to prevent and address the misuse of relevant information society services for online child sexual abuse in the internal market.
Amendment 516 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point d a (new)
(da) obligations on providers of online search engines to delist websites which have been determined to host child sexual abuse material;
Amendment 523 #
Proposal for a regulation
Article 1 – paragraph 3 – point b a (new)
(ba) Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online;
Amendment 528 #
Proposal for a regulation
Article 1 – paragraph 3 – point d a (new)
(da) Directive (EU) 2022/2555 of the European Parliament and of the Council of 14 December 2022 on measures for a high common level of cybersecurity across the Union, amending Regulation (EU) No 910/2014 and Directive (EU) 2018/1972 and repealing Directive (EU) 2016/1148 (NIS 2 Directive);
Amendment 531 #
Proposal for a regulation
Article 1 – paragraph 3 a (new)
3a. This Regulation shall not have the effect of modifying the obligation to respect the rights, freedoms and principles referred to in Article 6 TEU and shall apply without prejudice to fundamental principles relating to the right to private life and family life and to freedom of expression and information.
Amendment 535 #
Proposal for a regulation
Article 1 – paragraph 3 b (new)
3b. Nothing in this Regulation shall be interpreted as prohibiting or weakening end-to-end encryption.
Amendment 575 #
Proposal for a regulation
Article 2 – paragraph 1 – point j
Amendment 576 #
Proposal for a regulation
Article 2 – paragraph 1 – point j
Amendment 606 #
Proposal for a regulation
Article 2 – paragraph 1 – point w a (new)
(wa) "online search engine" means an intermediary service as defined in Article 3, point (j), of Regulation (EU) 2022/2065;
Amendment 607 #
Proposal for a regulation
Article 2 – paragraph 1 – point w b (new)
(wb) 'hotline' means an organization recognized by its Member State of establishment, which provides either a reporting channel provided by law enforcement authorities, or a service for receiving anonymous complaints from victims and the public about alleged child sexual abuse online.
Amendment 631 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 1 a (new)
- the availability of appropriate technical measures - such as parental control tools - to prevent underage access and exposure to inappropriate content or services;
Amendment 643 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4
– functionalities enabling users to flag or notify online child sexual abuse to the provider through tools that are easily accessible and age-appropriate, including already available anonymous reporting channels as provided by Directive (EU) 2019/1937;
Amendment 652 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 a (new)
- functionalities enabling self-reporting by children, their parents or legal guardians.
Amendment 740 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) adapting the design, features and functions of their service in order to ensure the highest level of privacy, safety and security by design and by default, in particular to protect children;
Amendment 746 #
Proposal for a regulation
Article 4 – paragraph 1 – point a b (new)
(ab) employing appropriate age measures - such as parental control tools - to prevent underage access and exposure to inappropriate content or services;
Amendment 766 #
Proposal for a regulation
Article 4 – paragraph 1 – point c a (new)
(ca) enabling users to flag or notify online child sexual abuse to the provider through tools that are easily accessible and age-appropriate, including already available anonymous reporting channels;
Amendment 771 #
Proposal for a regulation
Article 4 – paragraph 1 – point c b (new)
(cb) enabling safe self-reporting capabilities for children, their parents or legal guardians.
Amendment 860 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
(b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of exploiting children, or where the developer of the software application has informed the software application store that its terms of use do not allow child users, the software application has an appropriate age rating model in place, or the developer of the software application has requested the software application store not to allow child users to download its software applications.
Amendment 905 #
Proposal for a regulation
Article 7 – paragraph 2 a (new)
2a. The grounds for issuing the order shall outweigh the negative consequences for the rights and legitimate interests of all the parties concerned, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties. The order shall be a measure of last resort and shall be issued on the basis of a case-by-case analysis.
Amendment 1160 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
(da) not able to weaken end-to-end encryption or to lead to general monitoring of private communications.
Amendment 1188 #
Proposal for a regulation
Article 10 – paragraph 4 – point f a (new)
(fa) ensure privacy without hampering the integrity of encryption and without leading to general monitoring of private communications.
Amendment 1230 #
Proposal for a regulation
Article 12 – paragraph 3
3. The provider shall establish and operate an accessible, age-appropriate, child-friendly and user-friendly mechanism, including self-reporting tools, that allows users to flag or notify potential online child sexual abuse on the services to the provider. Those mechanisms shall allow for anonymous reporting, including through the anonymous reporting channels already available as provided for by Directive (EU) 2019/1937.
Amendment 1259 #
Proposal for a regulation
Article 13 – paragraph 1 – point g a (new)
(ga) whether the provider considers that the report involves an imminent threat to the life or safety of a child or requires urgent action;
Amendment 1395 #
Proposal for a regulation
Article 25 – paragraph 2 – subparagraph 1
Where Member States designate more than one competent authority, they shall appoint one of those competent authorities as their Coordinating Authority for child sexual abuse issues (‘Coordinating Authority’). Where they designate only one competent authority, that competent authority shall be the Coordinating Authority.
Amendment 1423 #
Proposal for a regulation
Article 26 – paragraph 2 – point c
(c) are free from any undue external influence, whether direct or indirect, in line with their national legislation;
Amendment 1743 #
Proposal for a regulation
Article 53 – paragraph 2 – subparagraph 1
Europol and the EU Centre shall provide each other with the fullest possible access to relevant information and information systems, where necessary for the performance of their respective tasks and in accordance with the acts of Union law regulating such access. Any access to personal data processed in Europol's information systems, where deemed strictly necessary for the performance of the EU Centre's tasks, shall be granted only on a case-by-case basis, upon submission of an explicit request indicating the specific purpose and justification. Europol shall be required to diligently assess those requests and only transmit personal data to the EU Centre where strictly necessary and proportionate to the required purpose.