
41 Amendments of Heléne FRITZON related to 2022/0155(COD)

Amendment 113 #
Proposal for a regulation
Recital 35
(35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted, the vast majority of whom are girls. Victims should therefore have the right to obtain, upon request, from the EU Centre via the Coordinating Authorities, relevant and age-appropriate information if known child sexual abuse material depicting them is reported by providers of hosting services or providers of publicly available interpersonal communications services in accordance with this Regulation.
2023/05/08
Committee: FEMM
Amendment 133 #
Proposal for a regulation
Recital 67
(67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse. In this connection, the EU Centre should cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned. The EU Centre should also provide knowledge, expertise and best practice on preventive measures targeted at abusers.
2023/05/08
Committee: FEMM
Amendment 146 #
Proposal for a regulation
Article 2 – paragraph 1 – point j
(j) ‘child user’ means a natural person who uses a relevant information society service and who is a natural person below the age of 17 years;
2023/05/08
Committee: FEMM
Amendment 162 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point ii
(ii) where the service is used by children, the different age groups of the child users and the risk of solicitation of children in relation to those age groups, as well as the risk of adults using the service for the purpose of solicitation of children;
2023/05/08
Committee: FEMM
Amendment 184 #
Proposal for a regulation
Article 4 – paragraph 2 – point b
(b) targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk, specific vulnerabilities of children online and offline including age, gender and disability, as well as the provider’s financial and technological capabilities and the number of users;
2023/05/08
Committee: FEMM
Amendment 194 #
Proposal for a regulation
Article 4 – paragraph 5
5. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2, 3 and 4, having due regard in particular to relevant technological developments, trends and evidence reported by law enforcement, hotlines, civil society organisations, EIGE and technology companies, in combating child sexual abuse online and in the manners in which the services covered by those provisions are offered and used.
2023/05/08
Committee: FEMM
Amendment 200 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
(b) take reasonable measures to prevent child users from accessing the software applications not intended for their use or adapted to their safety needs, in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children;
2023/05/08
Committee: FEMM
Amendment 206 #
Proposal for a regulation
Article 6 – paragraph 4
4. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments, trends and evidence reported by law enforcement, hotlines, civil society organisations, EIGE and technology companies, in combating child sexual abuse online, and to the manners in which the services covered by those provisions are offered and used.
2023/05/08
Committee: FEMM
Amendment 219 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point b
(b) where the draft implementation plan concerns an intended detection order concerning the solicitation of children other than the renewal of a previously issued detection order without any substantive changes, conduct impact assessments on data protection, gender, and child rights, and a prior consultation procedure as referred to in Articles 35 and 36 of Regulation (EU) 2016/679, respectively, in relation to the measures set out in the implementation plan;
2023/05/08
Committee: FEMM
Amendment 220 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point c
(c) where point (b) applies, or where the conditions of Articles 35 and 36 of Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where necessary in view of the outcome of the impact assessments on data protection, gender, and child rights, and in order to take into account the opinion of the data protection authority provided in response to the prior consultation;
2023/05/08
Committee: FEMM
Amendment 288 #
Proposal for a regulation
Article 10 – paragraph 4 – point d
(d) establish and operate an accessible, age-appropriate, gender-sensitive, and user-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner;
2023/05/08
Committee: FEMM
Amendment 289 #
Proposal for a regulation
Article 11 – paragraph 1
The Commission, in cooperation with the Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of Articles 7 to 10, having due regard in particular to relevant technological developments, trends and evidence reported by law enforcement, hotlines, civil society organisations, EIGE and technology companies, in combating child sexual abuse online, and the manners in which the services covered by those provisions are offered and used.
2023/05/08
Committee: FEMM
Amendment 295 #
Proposal for a regulation
Article 12 – paragraph 3
3. The provider shall establish and operate an accessible, age-appropriate and user-friendly mechanism with gender- and age-appropriate options that allows users to flag to the provider, including anonymously, potential online child sexual abuse on the service.
2023/05/08
Committee: FEMM
Amendment 296 #
Proposal for a regulation
Article 12 – paragraph 3 a (new)
3a. New possible child sexual abuse material reported by a user shall immediately be assessed to determine the probability that the material represents a risk of harm to a child. If the potential online child sexual abuse on the service is flagged by a user known to be a child, the provider shall provide the child with essential information on online safety and specialist child support services, such as helplines and hotlines, in addition to the reporting of the material.
2023/05/08
Committee: FEMM
Amendment 320 #
Proposal for a regulation
Article 15 – paragraph 3 – point c a (new)
(ca) if the user is a child, referral to competent national support services and essential information on online safety, in a child-friendly language.
2023/05/08
Committee: FEMM
Amendment 329 #
Proposal for a regulation
Recital 14 a (new)
(14a) Given the severity of these crimes, the long-lasting negative consequences on the victims and the risk of revictimisation as a result of the dissemination of known material, new material, as well as activities constituting the solicitation of children, it is essential that this Regulation provides specific obligations for providers of hosting services and providers of interpersonal communication services to prevent, detect, report and remove child sexual abuse material in all their services, including interpersonal communication services, which may also be covered by end-to-end encryption, in light of the prevalence of dissemination of child sexual abuse material, including the solicitation of children, in interpersonal communication services.
2023/07/28
Committee: LIBE
Amendment 385 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Nothing in this Regulation should therefore be interpreted as prohibiting end-to-end encryption or making it impossible. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users, while ensuring the effective detection of online child sexual abuse and the balance of all the fundamental rights at stake.
2023/07/28
Committee: LIBE
Amendment 401 #
Proposal for a regulation
Recital 27 a (new)
(27a) To the extent strictly necessary and proportionate to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, it should be possible for the Coordinating Authority of establishment to authorise providers to process metadata.
2023/07/28
Committee: LIBE
Amendment 417 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 2
As regards the first subparagraph, point (a), the provider may also preserve the information, including data on gender and age, for the purpose of improving the effectiveness and accuracy of the technologies to detect online child sexual abuse for the execution of a detection order issued to it in accordance with Article 7. However, it shall not store any personal data for that purpose.
2023/05/08
Committee: FEMM
Amendment 446 #
Proposal for a regulation
Article 39 – paragraph 1
1. Coordinating Authorities shall cooperate with each other, any other competent authorities of the Member State that designated the Coordinating Authority, the Commission, the EU Centre and other relevant Union agencies, including Europol, to facilitate the performance of their respective tasks under this Regulation and ensure its effective, efficient and consistent application and enforcement. Coordinating Authorities shall establish systematic practices on the exchange of information and best practices related to the prevention and combating of online child sexual abuse and solicitation of children.
2023/05/08
Committee: FEMM
Amendment 447 #
Proposal for a regulation
Article 40 – paragraph 2
2. The EU Centre shall contribute to the achievement of the objective of this Regulation by supporting and facilitating the implementation of its provisions concerning the detection, reporting, removal or disabling of access to, and blocking of online child sexual abuse, by gathering and sharing information, gender- and age-disaggregated statistics and expertise, and by facilitating cooperation and sharing of best practices between relevant public and private parties in connection to the prevention and combating of child sexual abuse, in particular online.
2023/05/08
Committee: FEMM
Amendment 474 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c a (new)
(ca) Establish mechanisms to listen to and incorporate the views of children in its work, in accordance with the UNCRC, Directive 2012/29/EU and the Charter of Fundamental Rights of the European Union.
2023/05/08
Committee: FEMM
Amendment 574 #
Proposal for a regulation
Article 2 – paragraph 1 – point i a (new)
(ia) "adult" means any natural person above the age of 18 years;
2023/07/28
Committee: LIBE
Amendment 583 #
Proposal for a regulation
Article 2 – paragraph 1 – point j a (new)
(ja) "adult user" means a natural person who uses a relevant information society service and who is a natural person above the age of 18 years;
2023/07/28
Committee: LIBE
Amendment 694 #
Proposal for a regulation
Article 3 – paragraph 2 a (new)
2a. When providers of hosting services and providers of interpersonal communication services put forward an age assurance or age verification system as a mitigation measure, it shall meet the following criteria: a) protect the privacy of users and not disclose data gathered for the purposes of age assurance for any other purpose; b) not collect data that is not necessary for the purpose of age assurance; c) be proportionate to the risks associated with the product or service that presents a risk of misuse for child sexual abuse; d) provide appropriate remedies and redress mechanisms for users whose age is wrongly identified.
2023/07/28
Committee: LIBE
Amendment 742 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) Designing educational and awareness-raising campaigns aimed at informing and alerting users about the risks of online child sexual abuse, including child-appropriate information;
2023/07/28
Committee: LIBE
Amendment 769 #
2023/07/28
Committee: LIBE
Amendment 776 #
Proposal for a regulation
Article 4 – paragraph 1 a (new)
1a. Providers of hosting services and providers of interpersonal communications services shall continue the voluntary use of specific technologies, as mitigation measures, for the processing of personal and other data to the extent strictly necessary to detect, report and remove online child sexual abuse on their services and to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, including for the purpose of the solicitation of children, pursuant to the risk assessment conducted or updated in accordance with Article 3 and prior authorisation from the Coordinating Authority.
2023/07/28
Committee: LIBE
Amendment 780 #
Proposal for a regulation
Article 4 – paragraph 1 b (new)
1b. The Coordinating Authority shall decide whether to proceed according to paragraph 1a no later than three months from the provider's request.
2023/07/28
Committee: LIBE
Amendment 805 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably differentiate between child and adult users on their services, enabling them to take the mitigation measures. Age assurance or age verification systems as mitigation measures shall be implemented only if they meet the criteria set in Article 3, paragraph 2a of this Regulation.
2023/07/28
Committee: LIBE
Amendment 826 #
Proposal for a regulation
Article 4 a (new)
Article 4a Legal basis for risk mitigation through metadata processing 1. On the basis of the risk assessment submitted and, where applicable, further information, the Coordinating Authority of establishment shall have the power to authorise or require a provider of hosting services or a provider of interpersonal communications services to process metadata to the extent strictly necessary and proportionate to mitigate the risk of misuse of their services for the purpose of online child sexual abuse, as a mitigation measure in accordance with Article 4. When assessing whether to request the processing of metadata, the Coordinating Authority shall take into account any interference with the rights to privacy and data protection of the users of the service that such processing entails and determine whether, in that case, the processing of metadata would be effective in mitigating the risk of use of the service for the purpose of child sexual abuse, and whether it is strictly necessary and proportionate. 2. If they process metadata as a risk mitigation measure, providers shall inform their users of such processing in their terms and conditions, including information on the possibility to submit complaints to the competent DPA concerning the relevant processing, in accordance with Regulation (EU) 2016/679, and on the avenues for judicial redress.
2023/07/28
Committee: LIBE
Amendment 925 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – introductory part
Where, having regard to the comments of the provider and the opinion of the EU Centre, and in particular taking into account the assessment of the EU Centre's Technical Committee as referred to in Article 66(6)(a NEW), that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall re-submit the draft request, adjusted where appropriate, to the provider. In that case, the provider shall do all of the following, within a reasonable time period set by that Coordinating Authority:
2023/07/28
Committee: LIBE
Amendment 928 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point a
(a) draft an implementation plan setting out the measures it envisages taking to execute the intended detection order, including detailed information regarding the envisaged technologies and safeguards; the implementation plan shall explicitly set out the specific measures that the provider intends to take to counteract potential security risks that might be linked to the execution of the detection order on its services. The provider may consult the EU Centre, and in particular its Technology Committee, to obtain support in identifying appropriate measures in this respect;
2023/07/28
Committee: LIBE
Amendment 1025 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 2
To that aim, they shall take into account all relevant parameters, including: (i) the availability of sufficiently reliable detection technologies, in that they can be deployed without undermining the security of the service in question and limit to the maximum extent possible the rate of errors regarding the detection; (ii) the suitability and effectiveness of the available technologies for achieving the objectives of this Regulation; (iii) the impact of the measures on the rights of the users affected; thereby ensuring that detection orders are only requested and issued where sufficiently reliable technologies in accordance with point (i) are available and that the least intrusive measures are chosen, in accordance with Article 10, from among several equally effective measures.
2023/07/28
Committee: LIBE
Amendment 1031 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 3 – point a
(a) where the information gathered in the risk assessment process indicates that the risk is limited to an identifiable part or component of a service, where possible without prejudice to the effectiveness of the measure, the required measures are only applied in respect of that part or component;
2023/07/28
Committee: LIBE
Amendment 1049 #
Proposal for a regulation
Article 7 a (new)
Article 7a Safeguards on encrypted services For the scope of this Regulation and for the sole purpose of preventing and combating child sexual abuse, providers of interpersonal communications services shall be subject to obligations to prevent, detect, report and remove online child sexual abuse on all their services, which may include those covered by end-to-end encryption, when there is a significant risk that their specific service is misused for online child sexual abuse, including for the purpose of the solicitation of children, pursuant to the risk assessment established in Article 3 of this Regulation. The technologies deployed to execute the detection order pursuant to Article 7 of this Regulation shall never prohibit encryption or make it impossible, shall only be deployed after prior authorisation by the Coordinating Authority, in consultation with the competent data protection authority, and shall be subject to constant monitoring and auditing by the competent data protection authority to verify their compliance with Union law.
2023/07/28
Committee: LIBE
Amendment 1136 #
Proposal for a regulation
Article 10 – paragraph 2
2. The provider shall be entitled to acquire, install and operate, free of charge, technologies made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the detection order and, where needed, of adopting the security measures imposed by Article 7(3)(a). The provider shall not be required to use any specific technology, including those made available by the EU Centre, as long as the requirements set out in this Article are met. The use of the technologies made available by the EU Centre shall not affect the responsibility of the provider to comply with those requirements and for any decisions it may take in connection to or as a result of the use of the technologies.
2023/07/28
Committee: LIBE
Amendment 1594 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 a (new)
(6a) support Member States in designing preventive measures, such as awareness-raising campaigns to combat child sexual abuse, with a specific focus on girls and other prevalent demographics, including by: a) acting on behalf of victims in liaising with other relevant authorities of the Member States for reparations and all other victim support programmes; b) referring victims to the appropriate child protection services and to pro bono legal support services.
2023/07/28
Committee: LIBE
Amendment 1618 #
Proposal for a regulation
Article 44 – paragraph 4 a (new)
4a. The EU Centre shall ensure through all technical means available that the database of indicators is secure and cannot be altered by providers, users or any other actor at the moment of its deployment for the purpose of detection.
2023/07/28
Committee: LIBE
Amendment 1697 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The EU Centre shall make available: (i) technologies that providers of hosting services and providers of interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1); (ii) technologies that providers of end-to-end encrypted electronic communication services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to adopt the security measures imposed on them by Article 7(3)(a).
2023/07/28
Committee: LIBE