204 Amendments of Alex AGIUS SALIBA related to 2022/0155(COD)
Amendment 337 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification and parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.
Amendment 357 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection warrants. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in number-independent publicly available interpersonal communications services, it should only be possible to address detection warrants to providers of such services.
Amendment 358 #
Proposal for a regulation
Recital 20 a (new)
(20a) End-to-end encryption is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any weakening of encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore prohibit or weaken end-to-end encryption, or be interpreted in that way.
Amendment 359 #
Proposal for a regulation
Recital 20 b (new)
(20b) The use of end-to-end encryption should be promoted and, where necessary, be mandatory in accordance with the principles of security and privacy by design. Member States should not impose any obligation on encryption providers, on providers of electronic communications services or on any other organisations, at any level of the supply chain, that would result in the weakening of the security of their networks and services, such as the creation or facilitation of backdoors or any other functionality allowing disclosure of communications content to third parties.
Amendment 360 #
Proposal for a regulation
Recital 20 c (new)
(20c) The act of breaking encryption refers to the act of defeating or bypassing the encryption protocol used to secure a communication. Any access by a third party that was not meant to access, read or edit the content of a communication that was supposed to be private and secure should be considered as undermining encryption.
Amendment 361 #
Proposal for a regulation
Recital 20 d (new)
(20d) The technologies used for the purpose of executing detection warrants should be in accordance with the state of the art in the industry and should be the least privacy-intrusive, including with regard to the principle of data protection by design and by default pursuant to Regulation (EU) 2016/679.
Amendment 364 #
Proposal for a regulation
Recital 21
(21) Furthermore, as part of those limits and safeguards, detection warrants should only be issued by a judicial authority and only for the purpose of detecting known online child sexual abuse material related to a specific device or user account, where there is a reasonable suspicion that such content is stored on that device or in that user account. One of the main elements to be taken into account in this regard is the existence of evidence demonstrating a reasonable suspicion that individual accounts or groups of accounts are being used for the purpose of online child sexual abuse.
Amendment 368 #
Proposal for a regulation
Recital 22
(22) However, the existence of evidence demonstrating a reasonable suspicion that individual accounts or groups of accounts are being used for the purpose of online child sexual abuse should in itself be insufficient to justify the issuance of a detection warrant, given that in such a case the warrant might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection warrants can be issued only after the Coordinating Authorities and the competent judicial authority have objectively and diligently assessed, identified and weighted, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.
Amendment 495 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse, in order to contribute to the proper functioning of the internal market and to create a safe, predictable and trusted online environment where fundamental rights enshrined in the Charter are effectively protected.
Amendment 502 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point b
(b) obligations on providers of hosting services and providers of number-independent interpersonal communications services to detect and report online child sexual abuse;
Amendment 508 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point c
(c) obligations on providers of hosting services to remove or disable access to child sexual abuse material on their services;
Amendment 511 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point d
Amendment 517 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point e
(e) rules on the implementation and enforcement of this Regulation, including as regards the designation and functioning of the competent authorities of the Member States, the EU Centre on Child Sexual Abuse established in Article 40 (‘EU Centre’) and cooperation and transparency;
Amendment 518 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point e a (new)
(ea) rules on the designation, functioning, cooperation, transparency and powers of the EU Centre on Child Sexual Abuse established in Article 40 (‘EU Centre’);
Amendment 524 #
Proposal for a regulation
Article 1 – paragraph 3 – point d
(d) Regulation (EU) 2016/679, Directive 2016/680, Regulation (EU) 2018/1725, and, subject to paragraph 4 of this Article, Directive 2002/58/EC.
Amendment 529 #
Proposal for a regulation
Article 1 – paragraph 3 – point d a (new)
(da) Regulation (EU) …/… [laying down harmonised rules on artificial intelligence (Artificial Intelligence Act)];
Amendment 532 #
Proposal for a regulation
Article 1 – paragraph 3 a (new)
3a. This Regulation shall not prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption, or be interpreted in that way.
Amendment 534 #
Proposal for a regulation
Article 1 – paragraph 3 b (new)
3b. This Regulation shall not undermine the prohibition of general monitoring under Union law or introduce general data retention obligations, or be interpreted in that way.
Amendment 539 #
Proposal for a regulation
Article 1 – paragraph 4
4. This Regulation limits the exercise of the rights and obligations provided for in Article 5(1) and (3) and Article 6(1) of Directive 2002/58/EC with the sole objective of enabling a provider of hosting services, a provider of number-independent interpersonal communications services or a provider of an artificial intelligence system to use specific technologies for the processing of personal data, to the extent strictly necessary to detect and report online child sexual abuse and to remove child sexual abuse material on their services, following a detection warrant issued in accordance with Section 2 of Chapter 1 of this Regulation.
Amendment 542 #
Proposal for a regulation
Article 1 – paragraph 4 a (new)
4a. This Regulation does not apply to audio communications.
Amendment 549 #
Proposal for a regulation
Article 2 – paragraph 1 – point b a (new)
(ba) ‘number-independent interpersonal communications service’ means a publicly available service as defined in Article 2, point 7, of Directive (EU) 2018/1972;
Amendment 554 #
Proposal for a regulation
Article 2 – paragraph 1 – point e a (new)
(ea) ‘artificial intelligence system’ means software as defined in Article 3(1) of Regulation (EU) …/… [laying down harmonised rules on artificial intelligence (Artificial Intelligence Act)];
Amendment 560 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point ii
(ii) a number-independent interpersonal communications service;
Amendment 563 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iii
Amendment 568 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iv a (new)
(iva) an artificial intelligence system;
Amendment 577 #
Proposal for a regulation
Article 2 – paragraph 1 – point j
Amendment 595 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) ‘child survivor’ means a person as defined in Article 2(1) point (a) of Directive 2011/93/EU who is below 18 years of age and suffered child sexual abuse offences;
Amendment 597 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(qb) 'survivor' means a person as defined in Article 2(1) point (a) of Directive 2011/93/EU who suffered child sexual abuse offences;
Amendment 599 #
Proposal for a regulation
Article 2 – paragraph 1 – point s
(s) ‘content data’ means videos and images in a digital format;
Amendment 605 #
Proposal for a regulation
Article 2 – paragraph 1 – point w a (new)
(wa) ‘hotline’ means an organisation officially recognised by a Member State, other than the reporting channels provided by law enforcement authorities, for receiving anonymous complaints from victims and the public about alleged child sexual abuse;
Amendment 608 #
Proposal for a regulation
Article -3 (new)
Article -3
Protection of fundamental human rights and confidentiality in communications
1. Nothing in this Regulation shall prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption, or be interpreted in that way.
2. Nothing in this Regulation shall undermine the prohibition of general monitoring under Union law or introduce general data retention obligations.
Amendment 610 #
Proposal for a regulation
Article 3 – paragraph 1
1. Providers of hosting services and providers of interpersonal communications services shall identify, analyse and assess any serious systemic risk stemming from the functioning and use of their services for the purpose of online child sexual abuse. That risk assessment shall be specific to the services they offer and proportionate to the serious systemic risk, considering its severity and probability. To this end, providers subject to an obligation to conduct a risk assessment under Regulation (EU) 2022/2065 may draw on that risk assessment and complement it with a more specific assessment of the risks of use of their services for the purpose of online child sexual abuse.
Amendment 618 #
Proposal for a regulation
Article 3 – paragraph 1 a (new)
1a. Without prejudice to Regulation (EU) 2022/2065, when conducting the risk assessment, providers of hosting services and providers of interpersonal communications services shall respect and avoid any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity, respect for private and family life, the protection of personal data, freedom of expression and information, including the freedom and pluralism of the media, the prohibition of discrimination, the rights of the child and consumer protection, as enshrined in Articles 1, 7, 8, 11, 21, 24 and 38 of the Charter respectively.
Amendment 622 #
Proposal for a regulation
Article 3 – paragraph 2 – point a
(a) any serious systemic risks and identified instances of use of its services for the purpose of online child sexual abuse;
Amendment 636 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
Amendment 645 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4
– functionalities enabling users to flag online child sexual abuse to the provider through tools that are easily recognisable, accessible, age-appropriate and child- and user-friendly, including anonymous reporting channels;
Amendment 649 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 a (new)
- systems and mechanisms that provide child- and user-friendly resources to ensure that children can seek help swiftly, including information on how to contact national child protection organisations or national law enforcement.
Amendment 661 #
Proposal for a regulation
Article 3 – paragraph 2 – point d
(d) the manner in which the provider designed and operates the service, including the business model, governance, type of users targeted, and relevant systems and processes, and the impact thereof on that risk;
Amendment 665 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point i
(i) the extent to which the service is directly targeting children;
Amendment 670 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point ii
(ii) where the service is directly targeting children, the different age groups of the children the service is targeting;
Amendment 674 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – introductory part
(iii) the availability of functionalities creating or reinforcing the serious systemic risk of solicitation of children, including the following functionalities:
Amendment 676 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 1
– enabling users to search for other users, in particular on services directly targeting children;
Amendment 679 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 2
– enabling users to establish unsolicited contact with other users and for users to engage and connect with children, in particular on services directly targeting children;
Amendment 686 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 3
– enabling users to share images or videos with other users, in particular on services directly targeting children;
Amendment 698 #
Proposal for a regulation
Article 3 – paragraph 3 – subparagraph 1 a (new)
Neither this request nor its subsequent analysis that the EU Centre may perform shall exempt the provider from its obligation to conduct the risk assessment in accordance with paragraphs 1 and 2 of this Article and to comply with other obligations set out in this Regulation.
Amendment 714 #
Proposal for a regulation
Article 3 – paragraph 5
Amendment 718 #
Proposal for a regulation
Article 3 – paragraph 6
6. The Commission, in cooperation with Coordinating Authorities, the European Data Protection Board, the Fundamental Rights Agency and the EU Centre, and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
Amendment 724 #
Proposal for a regulation
Article 4 – title
Safety-by-design and risk mitigation
Amendment 726 #
Proposal for a regulation
Article 4 – paragraph -1 (new)
-1. Providers of hosting services and providers of interpersonal communications services shall have mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be online child sexual abuse. This obligation shall not be interpreted as an obligation of general monitoring or generalised data retention. Such mechanisms shall be easy to access, child-friendly, and shall allow for the submission of notices by electronic means. [By 6 months after entry into force] the Commission shall adopt a delegated act laying down design requirements for a uniform identifiable notification mechanism as referred to in this Article, including on the design of a uniform, easily recognisable icon in the user interface. Providers of hosting services and providers of interpersonal communications services targeting children may implement the design requirements specified in the delegated act referred to in this paragraph.
Amendment 731 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall put in place reasonable, proportionate and targeted mitigation measures, tailored to their services and the serious systemic risk identified pursuant to Article 3, with the aim of mitigating that risk. Such measures shall never entail a general monitoring obligation or a generalised data retention obligation and shall include some or all of the following:
Amendment 736 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) testing and adapting, through state of the art technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision-making processes, the operation or functionalities of the service, or the content or enforcement of its terms and conditions, including the speed and quality of processing notices and reports related to online child sexual abuse and, where appropriate, the expeditious removal of the content notified;
Amendment 739 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) adapting the design, features and functions of their services in order to ensure a high level of privacy, data protection, safety, and security by design and by default, including some or all of the following:
(a) limiting users, by default, to establish direct contact with other users, in particular through private communications;
(b) limiting users, by default, to directly share images or videos on services;
(c) limiting users, by default, to directly share personal contact details with other users, such as phone numbers, home addresses and e-mail addresses, via rules-based matching;
(d) limiting users, by default, to create screenshots or recordings within the service;
(e) limiting users, by default, to directly forward images and videos to other users where no consent has been given;
(f) allowing parents of a child or a legal representative of a child to make use of meaningful parental control tools, which protect the confidentiality of communications of the child;
(g) encouraging children, prior to registering for the service, to talk to their parents about how the service works and what parental control tools are available.
Services taking the measures outlined in this point may allow users to revert such measures on an individual level.
Amendment 763 #
Proposal for a regulation
Article 4 – paragraph 1 – point c
(c) initiating or adjusting cooperation, in accordance with competition law, with other providers of relevant information society services, public authorities, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC].
Amendment 767 #
Proposal for a regulation
Article 4 – paragraph 1 – point c a (new)
(ca) reinforcing awareness-raising measures and adapting their online interface for increased user information, including child-appropriate information targeted to the risk identified;
Amendment 772 #
Proposal for a regulation
Article 4 – paragraph 1 – point c b (new)
(cb) including clearly visible and identifiable information on the minimum age for using the service;
Amendment 773 #
Proposal for a regulation
Article 4 – paragraph 1 – point c c (new)
(cc) initiating targeted measures to protect the rights of the child and tools aimed at helping users to indicate child sexual abuse material and helping children to signal abuse or obtain support;
Amendment 777 #
Proposal for a regulation
Article 4 – paragraph 1 a (new)
1a. Providers of hosting services and providers of interpersonal communications services directly targeting children shall implement the design requirements as specified in the delegated act referred to in paragraph -1 and shall take all mitigation measures as outlined in paragraph 1, point (aa), of this Article to minimise this risk. Such services shall allow users to revert mitigation measures on an individual level.
Amendment 784 #
(a) effective in mitigating the identified serious systemic risk;
Amendment 788 #
Proposal for a regulation
Article 4 – paragraph 2 – point b
(b) targeted and proportionate in relation to that serious systemic risk, taking into account, in particular, the seriousness of the risk as well as the provider’s financial and technological limitations and the number of users;
Amendment 792 #
Proposal for a regulation
Article 4 – paragraph 2 – point c
(c) applied in a diligent and non-discriminatory manner, having due regard, in all circumstances, to the potential consequences of the mitigation measures for the exercise of fundamental rights of all parties affected, in particular the rights to privacy, protection of data and freedom of expression.
Amendment 809 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take the necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the safety-by-design measures, including those mentioned in Article 4, paragraph 1a.
Amendment 819 #
Proposal for a regulation
Article 4 – paragraph 5
5. The Commission, in cooperation with Coordinating Authorities, the EU Centre, the European Data Protection Board and the Fundamental Rights Agency, and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2, 3 and 4, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used. The European Commission, along with the European Data Protection Board and the Fundamental Rights Agency, shall issue guidelines on how providers may implement age verification and age assessment measures, in particular based on selective disclosure of attributes, with full respect for the Charter of Fundamental Rights and Regulation (EU) 2016/679.
Amendment 823 #
Proposal for a regulation
Article 4 – paragraph 5 a (new)
5a. Prior to the deployment of any specific technology pursuant to this Article, a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation must be conducted.
Amendment 835 #
Proposal for a regulation
Article 5 – paragraph 1 – point a
(a) the process and the results of the risk assessment conducted or updated pursuant to Article 3, including the assessment of any potential remaining risk referred to in Article 3(5);
Amendment 848 #
Proposal for a regulation
Article 5 – paragraph 6
Amendment 858 #
Proposal for a regulation
Article 6 – paragraph 1 – introductory part
1. Providers of software application stores considered as gatekeepers under Regulation (EU) 2022/1925 (Digital Markets Act) shall:
Amendment 859 #
Proposal for a regulation
Article 6 – paragraph 1 – point a
(a) indicate, based on the information provided by the application developers, if applications contain features that could pose a risk to children;
Amendment 863 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
(b) indicate, based on the information provided by the application developers, if measures have been taken by the application to mitigate risks for children, and which measures have been taken;
Amendment 867 #
Proposal for a regulation
Article 6 – paragraph 1 – point c
Amendment 869 #
Proposal for a regulation
Article 6 – paragraph 1 – point c a (new)
(ca) indicate, based on the information provided by the application developers, the minimum age for using an application, as set out in the terms and conditions of the provider of the application;
Amendment 871 #
Proposal for a regulation
Article 6 – paragraph 2
Amendment 872 #
Proposal for a regulation
Article 6 – paragraph 3
Amendment 873 #
Proposal for a regulation
Article 6 – paragraph 4
4. The Commission, in cooperation with Coordinating Authorities, the EU Centre, the European Data Protection Board and the Fundamental Rights Agency, and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2 and 3, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
Amendment 883 #
Proposal for a regulation
Article 7 – title
Issuance of detection warrants
Amendment 886 #
Proposal for a regulation
Article 7 – paragraph 1
1. A competent judicial authority may issue, following a request by the Coordinating Authority of the Member State that designated the judicial authority, a detection warrant requiring a provider of hosting services or a provider of number-independent interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse material related to specific terminal equipment or a specific user account, where there is a reasonable suspicion that such content is stored on that terminal equipment or in that user account.
Amendment 904 #
Proposal for a regulation
Article 7 – paragraph 2 – subparagraph 2
Article 7 – paragraph 2 – subparagraph 2
To that end, it may, where appropriate, require the provider to submit the necessary information, additional to the report and the further information referred to in Article 5(1) and (3), respectively, within a reasonable time period set by that Coordinating Authority, or request the EU Centre, another public authority or relevant experts or entities to provide the necessary additional information.
Amendment 910 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point a
Article 7 – paragraph 3 – subparagraph 1 – point a
(a) establish a draft request to the competent judicial authority of the Member State that designated it for the issuance of a detection orderwarrant, specifying the main elements of the content of the detection orderwarrant it intends to request and the reasons for requesting it;
Amendment 913 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point b
Article 7 – paragraph 3 – subparagraph 1 – point b
Amendment 917 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point c
Article 7 – paragraph 3 – subparagraph 1 – point c
Amendment 918 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point d
Article 7 – paragraph 3 – subparagraph 1 – point d
Amendment 921 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point d a (new)
Article 7 – paragraph 3 – subparagraph 1 – point d a (new)
(da) request the supervisory authorities designated pursuant to Chapter VI, Section 1, of Regulation (EU) 2016/679 to perform their tasks within their competence pursuant to Chapter VI, Section 2, of Regulation (EU) 2016/679 and provide their opinion on the draft request, within a reasonable time period set by that Coordinating Authority;
Amendment 923 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2
Article 7 – paragraph 3 – subparagraph 2
Amendment 944 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 3
Article 7 – paragraph 3 – subparagraph 3
Where, having regard to the implementation plan of the provider and the opinion of the data protection authority, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 have been met, it shall submit the request for the issuance of the detection warrant, adjusted where appropriate, to the competent judicial authority or independent administrative authority. It shall attach the implementation plan of the provider and the opinions of the EU Centre and the data protection authority to that request.
Amendment 959 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point a
Article 7 – paragraph 4 – subparagraph 1 – point a
(a) there is evidence of a significant risk of the servicsubstantive evidence demonstrating a reasonable suspicion that individual accounts or groups of accounts are being used for the purpose of online child sexual abuse, within the meaning of paragraphs 5, 6 and 7, as applicable;
Amendment 962 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point a a (new)
Article 7 – paragraph 4 – subparagraph 1 – point a a (new)
(aa) the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter;
Amendment 968 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b a (new)
Article 7 – paragraph 4 – subparagraph 1 – point b a (new)
(ba) the detection warrant does not affect the security and confidentiality of communications on a general scale;
Amendment 970 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b b (new)
Article 7 – paragraph 4 – subparagraph 1 – point b b (new)
(bb) the technology used to protect the communication, such as any kind of encryption, shall not be affected or undermined by the detection warrant.
Amendment 977 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point -a (new)
Article 7 – paragraph 4 – subparagraph 2 – point -a (new)
(-a) the availability of information to adequately describe the specific purpose and scope of the order, including the legal basis for the suspicion;
Amendment 984 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point c
Article 7 – paragraph 4 – subparagraph 2 – point c
(c) the views, including on the technical feasibility, and the implementation plan of the provider submitted in accordance with paragraph 3;
Amendment 986 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point d
Article 7 – paragraph 4 – subparagraph 2 – point d
(d) the opinions of the EU Centre and of the data protection adata protection authority submitted in accordance with paragraph 3 and, where applicable, the opinion of the Coordinating Authority issubmitted in accordance with Article 5, paragraph 34b.
Amendment 991 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 3
Article 7 – paragraph 4 – subparagraph 3
Amendment 995 #
Proposal for a regulation
Article 7 – paragraph 5
Article 7 – paragraph 5
Amendment 1004 #
Proposal for a regulation
Article 7 – paragraph 6
Article 7 – paragraph 6
Amendment 1010 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 1
Article 7 – paragraph 7 – subparagraph 1
Amendment 1013 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 2
Article 7 – paragraph 7 – subparagraph 2
Amendment 1019 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 1
Article 7 – paragraph 8 – subparagraph 1
The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial or independent administrative authority when issuing the detection order, shall in accordance with Article 8 of Regulation (EU) 2022/2065 target and specify it in such a manner that the negative consequences referred to in paragraph 4, first subparagraph, point (b), remain limited to what is strictly necessary, justifiable and proportionate to effectively address the significant riskreasonable suspicion referred to in point (a) thereof.
Amendment 1024 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 2
Article 7 – paragraph 8 – subparagraph 2
To that aim, they shall take into account all relevant parameters, including the technical feasibility, availability of sufficiently reliable detection technologies in that they limit to the maximum extent possible the rate of errors regarding the detection and their suitability and effectiveness for achieving the objectives of this Regulation, as well as the impact of the measures on the rights of the users affected, and require the taking of the least intrusive measures, in accordance with Article 10, from among several equally effective measuresin particular the risk of inaccurately identifying lawful speech as illegal content, as well as the impact of the measures on the rights of the users affected and on the security, integrity and confidentiality of their communications, and require the taking of the least intrusive measures, in accordance with Article 10, from among several equally effective measures. To this end, they shall ensure technologies are able to distinguish between known child abuse material and lawful speech accurately enough that no human intervention is needed.
Amendment 1028 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 3
Article 7 – paragraph 8 – subparagraph 3
Amendment 1043 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 3
Article 7 – paragraph 9 – subparagraph 3
The period of application of detection orderwarrants concerning the dissemination of known or new child sexual abuse material shall not exceed 24 months and that of detection orders concerning the solicitation of children shall not exceed 126 months.
Amendment 1046 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 3 a (new)
Article 7 – paragraph 9 – subparagraph 3 a (new)
The European Data Protection Board shall also issue guidelines regarding the compliance with Regulation (EU) 2016/679 of existing and future technologies that are used for the detection of child sexual abuse material in encrypted and non-encrypted environments. Supervisory authorities as referred to in that Regulation shall supervise the application of those guidelines. Prior to the use of any specific technology pursuant to this Article, a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation must be conducted.
Amendment 1047 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 3 b (new)
Article 7 – paragraph 9 – subparagraph 3 b (new)
The competent supervisory authorities designated pursuant to Chapter VI, Section 1, of Regulation (EU) 2016/679 shall have the right to challenge a detection warrant within their competence pursuant to Chapter VI, Section 2, of Regulation (EU) 2016/679 before the courts of the Member State of the competent judicial authority that issued the detection warrant.
Amendment 1056 #
Proposal for a regulation
Article 8 – title
Article 8 – title
Additional rules regarding detection orderwarrants
Amendment 1061 #
Proposal for a regulation
Article 8 – paragraph 1 – introductory part
Article 8 – paragraph 1 – introductory part
1. The competent judicial authority or independent administrative authority shall issue the detection orderwarrants referred to in Article 7 using the template set out in Annex I. Detection orderwarrants shall include:
Amendment 1066 #
Proposal for a regulation
Article 8 – paragraph 1 – point a a (new)
Article 8 – paragraph 1 – point a a (new)
(aa) information, with respect to each device or user account, detailing the specific purpose and scope of the warrant, including the legal basis for the reasonable suspicion.
Amendment 1072 #
Proposal for a regulation
Article 8 – paragraph 1 – point e
Article 8 – paragraph 1 – point e
Amendment 1077 #
Proposal for a regulation
Article 8 – paragraph 1 – point g
Article 8 – paragraph 1 – point g
(g) a sufficiently detailed statement of reasjustifications explaining why the detection orderwarrant is issued and how it is necessary, effective and proportionate;
Amendment 1102 #
Proposal for a regulation
Article 9 – title
Article 9 – title
9 Redress, information, reporting and modification of detection orderwarrants
Amendment 1105 #
Proposal for a regulation
Article 9 – paragraph 1
Article 9 – paragraph 1
1. Providers of hosting services and providers of number-independent interpersonal communications services that have received a detection orderwarrant, as well as users affected by the measures taken to execute it, shall have a right to information and effective redress. That right shall include the right to challenge the detection orderwarrant before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the detection order.
Amendment 1130 #
Proposal for a regulation
Article 10 – paragraph 1
Article 10 – paragraph 1
1. Providers of hosting services and providers of number-independent interpersonal communication services that have received a detection orderwarrant shall execute it by installing and operating technologiessecure and privacy-friendly technologies, approved by the Centre, to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46.
Amendment 1145 #
Proposal for a regulation
Article 10 – paragraph 3 – point a
Article 10 – paragraph 3 – point a
(a) effective in detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable;
Amendment 1147 #
Proposal for a regulation
Article 10 – paragraph 3 – point b
Article 10 – paragraph 3 – point b
(b) not be able to extract any other information from the relevant communications than the information strictly necessary to detect, using the indicators referred to in paragraph 1, patterns pointing to the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable;
Amendment 1149 #
Proposal for a regulation
Article 10 – paragraph 3 – point c
Article 10 – paragraph 3 – point c
(c) in accordance with the state of the art in the industry and the least intrusive in terms of the impact on the users’ rights to private and family life, including the confidentiality of communication, and to protection of personal data. It shall not weaken or undermine end-to-end encryption and shall not limit providers of information society services from providing their services applying end-to- end encryption;
Amendment 1158 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
Article 10 – paragraph 3 – point d a (new)
(da) ensure that the interference with the fundamental right to privacy and the other rights laid down in the Charter is limited to what is strictly necessary.
Amendment 1169 #
Proposal for a regulation
Article 10 – paragraph 4 – point -a (new)
Article 10 – paragraph 4 – point -a (new)
(-a) ensure privacy by design and safety-by-design and by default and, where applicable, the protection of encryption.
Amendment 1179 #
(c) ensure regularcontinuous human oversight as necessary to ensure that the technologies operate in a sufficiently reliable manner and, where necessary, in particular when potential errors and potential solicitation of children are detected, immediate human intervention;
Amendment 1183 #
Proposal for a regulation
Article 10 – paragraph 4 – point d
Article 10 – paragraph 4 – point d
(d) establish and operate an accessible, age-appropriate and user- and child- friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner;
Amendment 1184 #
Proposal for a regulation
Article 10 – paragraph 4 – point e
Article 10 – paragraph 4 – point e
(e) inform the Coordinating Authority and competent Data Protection Authority, at the latest one month before the start date specified in the detection order, on the implementation of the envisaged measures set out in the implementation plan referred to in Article 7(3);
Amendment 1187 #
Proposal for a regulation
Article 10 – paragraph 4 – point e a (new)
Article 10 – paragraph 4 – point e a (new)
(ea) ensure that, in respect of any specific technology used for the purpose set out in this Article, a prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a prior consultation procedure as referred to in Article 36 of that Regulation have been conducted;
Amendment 1190 #
Proposal for a regulation
Article 10 – paragraph 4 a (new)
Article 10 – paragraph 4 a (new)
4a. In respect of any specific technology used for the purpose set out in this Article, the provider shall conduct a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation.
Amendment 1192 #
Proposal for a regulation
Article 10 – paragraph 5 – subparagraph 1 – point a
Article 10 – paragraph 5 – subparagraph 1 – point a
(a) the fact that it operates technologies to detect onlineknown child sexual abuse material to execute the detection orderwarrant, the ways in which it operates those technologies and the impact on the users’ fundamental rights to private and family life, including the confidentiality of users’ communications and the protection of personal data;
Amendment 1212 #
Proposal for a regulation
Article 11 – paragraph 1
Article 11 – paragraph 1
The Commission, in cooperation with the Coordinating Authorities and the EU Centre and after having consulted the European Data Protection Board and having conducted a public consultation, may issue guidelines on the application of Articles 7 to 10, having due regard in particular to relevant technological developments and the manners in which the services covered by those provisions are offered and used.
Amendment 1215 #
Proposal for a regulation
Article 12 – paragraph 1
Article 12 – paragraph 1
1. Where a provider of hosting services or a provider of number- independent interpersonal communications services becomes aware in any manner other than through a removal order issued in accordance with this Regulation of any information indicating potentiallleged online child sexual abuse on its services, it shall promptly report, without delay, that abuse to the competent law enforcement and independent judicial authorities and submit a report thereon to the EU Centre in accordance with Article 13. It shall do so through the system established in accordance with Article 39(2).
Amendment 1229 #
Proposal for a regulation
Article 12 – paragraph 3
Article 12 – paragraph 3
3. The provider, the EU Centre, the competent authority or any judicial enforcement bodies shall, without undue delay, notify the individual or entity that have notified the alleged online child sexual abuse of their decision in respect of the information to which the notified content relates, providing information on the possibilities for redress in respect of that decision.
Amendment 1235 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
Article 13 – paragraph 1 – introductory part
1. Providers of hosting services and providers of number-independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
Amendment 1240 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
Article 13 – paragraph 1 – point c
(c) all content data, including images, videos and text;
Amendment 1243 #
Proposal for a regulation
Article 13 – paragraph 1 – point c a (new)
Article 13 – paragraph 1 – point c a (new)
(ca) where applicable, an exact uniform resource locator and, where necessary, additional information for the identification of the child sexual abuse material;
Amendment 1246 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
Article 13 – paragraph 1 – point d
Amendment 1253 #
Proposal for a regulation
Article 13 – paragraph 1 – point f
Article 13 – paragraph 1 – point f
Amendment 1265 #
Proposal for a regulation
Article 13 – paragraph 1 – point j a (new)
Article 13 – paragraph 1 – point j a (new)
(ja) information on the tools used by the provider to become aware of the reported online child sexual abuse, including data and aggregate statistics on how technologies used by the provider work;
Amendment 1268 #
Proposal for a regulation
Article 14 – paragraph 1
Article 14 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a removal order requiring a provider of hosting services under the jurisdiction of the Member State that designated that Coordinating Authority to remove or disable access in all Member States of one or more specific items of material that, after a diligent assessment, the Coordinating Authority or the courts or other independent administrative authorities referred to in Article 36(1) identified as constituting child sexual abuse material.
Amendment 1273 #
Proposal for a regulation
Article 14 – paragraph 2
Article 14 – paragraph 2
2. The provider shall execute the removal order as soon as possible and in any event within 24 hours of receipt thereof. For micro, small and medium enterprises, including open source providers, the removal order shall allow additional time, proportionate to the size and the resources of the provider.
Amendment 1286 #
1. Providers of hosting services that have received a removal order issued in accordance with Article 14, as well as the users who provided the material, shall have the right to an effective redress. That right shall include the right to challenge such a removal order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the removal order.
Amendment 1293 #
Proposal for a regulation
Chapter II – Section 5
Chapter II – Section 5
Amendment 1297 #
Proposal for a regulation
Article 16
Article 16
Amendment 1312 #
Proposal for a regulation
Article 17
Article 17
Amendment 1321 #
Proposal for a regulation
Article 18
Article 18
Amendment 1331 #
Proposal for a regulation
Article 19 – paragraph 1
Article 19 – paragraph 1
Providers of relevant information society services shall not be liable for child sexual abuse offences solely because they carry out, in good faith, the necessary activities to comply with the requirements of this Regulation, in particular activities aimed at detecting, identifying, removing, disabling of access to, blocklabelling, or reporting online child sexual abuse in accordance with those requirements.
Amendment 1333 #
Proposal for a regulation
Article 20 – title
Article 20 – title
20 Victims’Survivors’ right to information and support
Amendment 1337 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1
Article 20 – paragraph 1 – subparagraph 1
Amendment 1342 #
Proposal for a regulation
Article 20 – paragraph 1 – subparagraph 1 a (new)
Article 20 – paragraph 1 – subparagraph 1 a (new)
The Coordinating Authority shall ensure that survivors, including child survivors and parents of child survivors, are informed about survivor support services where the survivors can receive age- appropriate and gender-sensitive information and support.
Amendment 1351 #
Proposal for a regulation
Article 20 – paragraph 3 – point d a (new)
Article 20 – paragraph 3 – point d a (new)
(da) information regarding age- appropriate and gender-sensitive survivor support services to provide the child, family and survivors with adequate emotional and psychosocial support as well as practical and legal assistance.
Amendment 1355 #
Proposal for a regulation
Article 21 – title
Article 21 – title
Amendment 1359 #
Proposal for a regulation
Article 21 – paragraph 1
Article 21 – paragraph 1
1. Providers of hostingrelevant information society services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
Amendment 1367 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
Article 21 – paragraph 2 – subparagraph 1
Persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting services remove or disable access to one or more specific items of known child sexual abuse material depicting them. Persons with disabilities shall have the right to ask and receive any information relating to such support in a manner accessible to them.
Amendment 1374 #
Proposal for a regulation
Article 21 a (new)
Article 21 a (new)
Article 21a
Right to lodge a complaint with a supervisory authority
1. Without prejudice to any other administrative or judicial remedy, every user shall have the right to lodge a complaint with a supervisory authority, in particular in the Member State of his or her habitual residence, place of work or place of the alleged infringement, if the user considers that the processing of personal data relating to him or her infringes this Regulation or Regulation (EU) 2016/679.
2. The supervisory authority with which the complaint has been lodged shall inform the complainant on the progress and the outcome of the complaint, including the possibility of a judicial remedy pursuant to Article 21b.
Amendment 1375 #
Proposal for a regulation
Article 21 b (new)
Article 21 b (new)
Article 21b
Right to an effective judicial remedy against a provider of hosting services or a provider of a number-independent interpersonal communications service
1. Without prejudice to any available administrative or non-judicial remedy, including the right to lodge a complaint with a supervisory authority pursuant to Article 21a, each user shall have the right to an effective judicial remedy where he or she considers that his or her rights under this Regulation have been infringed as a result of the processing of his or her personal data in non-compliance with this Regulation or Regulation (EU) 2016/679.
2. Proceedings against a provider of hosting services or a provider of a number-independent interpersonal communications service shall be brought before the courts of the Member State where the provider has an establishment. Alternatively, such proceedings may be brought before the courts of the Member State where the user has his or her habitual residence.
Amendment 1377 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 1 – introductory part
Article 22 – paragraph 1 – subparagraph 1 – introductory part
Providers of hosting services and providers of number-independent interpersonal communications services shall preserve the content data and other data processed in connection to the measures taken to comply with this Regulation and the personal data generated through such processing, only for one or more of the following purposes, as applicable:
Amendment 1384 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 2
Article 22 – paragraph 1 – subparagraph 2
Amendment 1402 #
Proposal for a regulation
Article 25 – paragraph 5
Article 25 – paragraph 5
5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation in that Member State. Member States shall make the information on the contact point publicly available, shall disseminate this information through gender-sensitive awareness-raising campaigns in public places frequented by children, and girls in particular, and shall communicate it to the EU Centre. They shall keep that information updated.
Amendment 1415 #
Proposal for a regulation
Article 25 – paragraph 7 – point d a (new)
Article 25 – paragraph 7 – point d a (new)
(da) provide knowledge and expertise on appropriate prevention techniques tailored by age and gender against online solicitation of children and the dissemination of child sexual abuse material online.
Amendment 1417 #
Proposal for a regulation
Article 25 – paragraph 8 a (new)
Article 25 – paragraph 8 a (new)
8a. The EU Centre shall support Member States in designing preventive and gender-sensitive measures, such as awareness-raising campaigns to combat child sexual abuse, guaranteeing comprehensive sexuality and relationships education in all schools, introducing digital skills, literacy and safety online programs in formal education, ensuring the full availability of specialized support services tailored by gender and age for child survivors of sexual abuse and children in vulnerable situations.
Amendment 1418 #
Proposal for a regulation
Article 25 – paragraph 9 a (new)
Article 25 – paragraph 9 a (new)
9a. In its contact with survivors or in any decision affecting survivors, the Coordinating Authority shall operate in an age-appropriate and gender-sensitive way that minimises risks to survivors, especially children, addresses the harm suffered by survivors and meets their needs. It shall operate in a victim- and gender-sensitive manner which prioritises recognising and listening to the survivor, avoids secondary victimisation and retraumatisation, systematically focuses on their safety, rights, well-being, expressed needs and choices, and ensures they are treated in an empathetic, sensitive and non-judgmental way.
Amendment 1478 #
Proposal for a regulation
Article 35 – paragraph 4 a (new)
Article 35 – paragraph 4 a (new)
4a. Member States shall ensure that penalties imposed for the infringement of this Regulation do not encourage the over-reporting or the removal of material which does not constitute child sexual abuse material.
Amendment 1479 #
Proposal for a regulation
Article 35 a (new)
Article 35 a (new)
Article 35a
Compensation
Users and any body, organisation or association mandated to exercise the rights conferred by this Regulation on their behalf shall have the right to seek, in accordance with Union and national law, compensation from providers of relevant information society services for any damage or loss suffered due to an infringement by those providers of their obligations under this Regulation.
Amendment 1510 #
Proposal for a regulation
Article 38 – paragraph 1 – subparagraph 1
Article 38 – paragraph 1 – subparagraph 1
Coordinating Authorities shall share best practice standards and guidance on the detection and removal of child sexual abuse material and may participate in joint investigations, which may be coordinated with the support of the EU Centre, of matters covered by this Regulation, concerning providers of relevant information society services that offer their services in several Member States. Those joint investigations shall also take place on the dark web.
Amendment 1514 #
Proposal for a regulation
Article 38 – paragraph 2 a (new)
Article 38 – paragraph 2 a (new)
2a. Coordinating Authorities shall increase public awareness regarding the nature of the problem of online child sexual abuse material, how to seek assistance, and how to work with providers of relevant information society services to remove content and coordinate victim identification efforts undertaken in collaboration with existing victim identification programmes.
Amendment 1525 #
Proposal for a regulation
Article 39 – paragraph 3 a (new)
Article 39 – paragraph 3 a (new)
Amendment 1539 #
Proposal for a regulation
Article 42 – paragraph 1
Article 42 – paragraph 1
The seat of the EU Centre shall be The Hague, The Netherlandchoice of the location of the seat of the Centre shall be made in accordance with the ordinary legislative procedure, based on the following criteria:
(a) it shall not affect the Centre’s execution of its tasks and powers, the organisation of its governance structure, the operation of its main organisation, or the main financing of its activities;
(b) it shall ensure that the Centre is able to recruit the highly qualified and specialised staff it requires to perform the tasks and exercise the powers provided by this Regulation;
(c) it shall ensure that it can be set up on site upon the entry into force of this Regulation;
(d) it shall ensure appropriate accessibility of the location, the existence of adequate education facilities for the children of staff members, and appropriate access to the labour market, social security and medical care for both children and spouses;
(da) it shall ensure a balanced geographical distribution of EU institutions, bodies and agencies across the Union;
(db) it shall ensure its national child sexual abuse framework is of a proven quality and repute, and shall benefit from the experience of national authorities;
(dc) it shall enable adequate training opportunities for combating child sexual abuse activities;
(dd) it shall enable close cooperation with EU institutions, bodies and agencies, but it shall be independent of any of the aforementioned;
(de) it shall ensure sustainability and digital security and connectivity with regard to physical and IT infrastructure and working conditions.
Amendment 1553 #
Proposal for a regulation
Article 43 – paragraph 1 – point 2
Article 43 – paragraph 1 – point 2
Amendment 1572 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point a
Article 43 – paragraph 1 – point 6 – point a
(a) collecting, recording, analysing and providing gender- and age-specific information, providing analysis based on anonymised and non-personal data gathering, including gender- and age-disaggregated data, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51;
Amendment 1575 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b
Article 43 – paragraph 1 – point 6 – point b
(b) supporting the development and dissemination of research and expertise on those matters and on assistance to victimssurvivors, taking into account the gender dimension, including by serving as a hub of expertise to support evidence-based policy;
Amendment 1577 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b a (new)
Article 43 – paragraph 1 – point 6 – point b a (new)
(ba) providing technical expertise and promoting the exchange of best practices among Member States on raising awareness for the prevention of child sexual abuse online in formal and non-formal education. Such efforts shall be age-appropriate and gender-sensitive;
Amendment 1582 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b b (new)
Article 43 – paragraph 1 – point 6 – point b b (new)
(bb) exchanging best practices among Coordinating Authorities regarding the available tools to reduce the risk of children becoming victims of sexual abuse and to provide specialized assistance to survivors, in an age-appropriate and gender-sensitive way.
Amendment 1585 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b c (new)
Article 43 – paragraph 1 – point 6 – point b c (new)
(bc) referring survivors to appropriate child protection services;
Amendment 1587 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c a (new)
Article 43 – paragraph 1 – point 6 – point c a (new)
Amendment 1588 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c b (new)
Article 43 – paragraph 1 – point 6 – point c b (new)
(cb) create and oversee an "EU hashing list of known child sexual abuse material" and modify the content of that list, independently, autonomously and free of political, government or industry influence or interference;
Amendment 1589 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c c (new)
Article 43 – paragraph 1 – point 6 – point c c (new)
(cc) develop, in accordance with the implementing act as referred to in Article 43a, the European Centralised Helpline for Abuse of Teenagers (eCHAT), interconnecting via effective interoperability the national hotlines' helplines, allowing children to reach out 24/7 via a recognisable central helpline in an anonymous way, in their own language and free of charge;
Amendment 1590 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c d (new)
Article 43 – paragraph 1 – point 6 – point c d (new)
(cd) have at its disposal the resources needed to develop hashing technology tools, where possible open source, for small and medium-sized relevant information society services to prevent the dissemination of known child sexual abuse material in publicly accessible content.
Amendment 1591 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c e (new)
Article 43 – paragraph 1 – point 6 – point c e (new)
(ce) coordinate the sharing and filtering of Suspicious Activity Reports on alleged "known child sexual abuse material", operating independently, autonomously, free of political, government or industry influence or interference and in full respect of fundamental rights, including privacy and data protection. [By 1 year after entry into force] the Commission shall adopt a delegated act laying down requirements for a Suspicious Activity Reports format, as referred to in this paragraph, and the differentiation between actionable and non-actionable Suspicious Activity Reports. This delegated act shall not prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption, or be interpreted in that way.
Amendment 1592 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c f (new)
Article 43 – paragraph 1 – point 6 – point c f (new)
(cf) scan public servers and public communications channels for known child sexual abuse material, with proven technology, solely for the purposes of amending the EU Hashing List and flagging the content for removal to the service provider of the specific public server or public communications channel, without prejudice to Article -3. The European Data Protection Board shall issue guidelines regarding the compliance with Regulation (EU) 2016/679 of existing and future technologies that are used for the purpose of scanning.
Amendment 1597 #
Proposal for a regulation
Article 43 a (new)
Article 43 a (new)
Article 43a
Implementing act for the interconnection of helplines
1. The national helplines referred to in Article 43 shall be interconnected via the European Centralised Helpline for Abuse of Teenagers (eCHAT), to be developed and operated by the EU Centre by ... [two years after the date of entry into force of this Regulation].
2. The Commission shall be empowered to adopt, by means of implementing acts, the technical specifications and procedures necessary to provide for the interconnection of national hotlines' online chat systems via eCHAT in accordance with Article 43 with regard to:
(a) the technical data necessary for the eCHAT system to perform its functions and the method of storage, use and protection of that technical data;
(b) the common criteria according to which national helplines shall be available through the system of interconnection of helplines;
(c) the technical details on how helplines shall be made available;
(d) the technical conditions of availability of services provided by the system of interconnection of helplines.
Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 5 of Regulation (EU) No 182/2011.
3. When adopting the implementing acts referred to in paragraph 2, the Commission shall take into account proven technology and existing practices.
Amendment 1603 #
Proposal for a regulation
Article 44 – paragraph 1 – point b
Article 44 – paragraph 1 – point b
Amendment 1606 #
Proposal for a regulation
Article 44 – paragraph 1 – point c
Article 44 – paragraph 1 – point c
Amendment 1704 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 3
Article 50 – paragraph 1 – subparagraph 3
Before including specific technologies on those lists, the EU Centre shall request the opinion of its Technology Committee, the Experts' Consultative Forum, and of the European Data Protection Board. The Technology Committee and the European Data Protection Board shall deliver their respective opinions within eight weeks. That period may be extended by a further six weeks where necessary, taking into account the complexity of the subject matter. The Technology Committee and the European Data Protection Board shall inform the EU Centre of any such extension within one month of receipt of the request for consultation, together with the reasons for the delay. Where the EU Centre substantially deviates from those opinions, it shall inform the Technology Committee or the European Data Protection Board and the Commission thereof, specifying the points at which it deviated and the main reasons for the deviation.
Amendment 1716 #
Proposal for a regulation
Article 50 – paragraph 5
Article 50 – paragraph 5
5. The EU Centre shall develop a communication strategy and promote dialogue with civil society organisations and providers of hosting or interpersonal communication services to raise public awareness of online child sexual abuse and measures to prevent and combat such abuse. Communication campaigns shall be easily understandable and accessible to all children, their families and educators in formal and non-formal education in the Union, aiming to improve digital literacy and ensure a safe digital environment for children. Communication campaigns shall take into account the gender dimension of the crime.
Amendment 1742 #
Proposal for a regulation
Article 53 – paragraph 2 – subparagraph 1
Article 53 – paragraph 2 – subparagraph 1
Amendment 1745 #
Proposal for a regulation
Article 53 – paragraph 2 – subparagraph 2
Article 53 – paragraph 2 – subparagraph 2
Amendment 1753 #
Proposal for a regulation
Article 53 – paragraph 3
Article 53 – paragraph 3
3. The terms of cooperation and working arrangements shall be laid down in a publicly accessible memorandum of understanding.
Amendment 1762 #
Proposal for a regulation
Article 55 – paragraph 1 – introductory part
Article 55 – paragraph 1 – introductory part
The administrative and management structure of the EU Centre shall be gender-balanced and comprise:
Amendment 1764 #
Proposal for a regulation
Article 55 – paragraph 1 – point d a (new)
Article 55 – paragraph 1 – point d a (new)
(da) a Fundamental Rights Officer, who shall exercise the tasks set out in Article 66b;
Amendment 1765 #
Proposal for a regulation
Article 55 – paragraph 1 – point d b (new)
Article 55 – paragraph 1 – point d b (new)
(db) an Experts' Consultative Forum, which shall exercise the tasks set out in Article 66a;
Amendment 1767 #
Proposal for a regulation
Article 56 – paragraph 1
Article 56 – paragraph 1
1. The Management Board shall be gender-balanced and composed of one representative from each Member State and two representatives of the Commission, all as members with voting rights.
Amendment 1775 #
Proposal for a regulation
Article 56 – paragraph 4
Article 56 – paragraph 4
4. Members of the Management Board and their alternates shall be appointed in the light of their knowledge in the field of combating child sexual abuse, taking into account relevant managerial, administrative and budgetary skills. Member States shall appoint a representative of their Coordinating Authority, within four months of [date of entry into force of this Regulation]. All parties represented in the Management Board shall make efforts to limit turnover of their representatives, in order to ensure continuity of its work. All parties shall aim to achieve a balanced representationensure that gender balance between men and women is achieved on the Management Board with at least 40% of candidates of each sex.
Amendment 1779 #
Proposal for a regulation
Article 57 – paragraph 1 – point f
Article 57 – paragraph 1 – point f
(f) appoint the members of the Technology Committee, the Experts' Consultative Forum and of any other advisory group it may establish;
Amendment 1780 #
Proposal for a regulation
Article 57 – paragraph 1 – point f a (new)
Article 57 – paragraph 1 – point f a (new)
(fa) appoint a Data Protection Officer;
Amendment 1781 #
Proposal for a regulation
Article 57 – paragraph 1 – point f b (new)
Article 57 – paragraph 1 – point f b (new)
(fb) appoint a Fundamental Rights Officer;
Amendment 1786 #
Proposal for a regulation
Article 61 – paragraph 1 – subparagraph 1
Article 61 – paragraph 1 – subparagraph 1
The Executive Board shall be gender-balanced and composed of the Chairperson and the Deputy Chairperson of the Management Board, two other members appointed by the Management Board from among its members with the right to vote and two representatives of the Commission to the Management Board. The Chairperson of the Management Board shall also be the Chairperson of the Executive Board. The composition of the Executive Board shall take into consideration gender balance, with at least 40% of each sex.
Amendment 1806 #
Proposal for a regulation
Article 66 a (new)
Article 66 a (new)
Amendment 1807 #
Proposal for a regulation
Chapter IV – Section 5 – Part 3 a (new)
Chapter IV – Section 5 – Part 3 a (new)
Amendment 1877 #
Proposal for a regulation
Article 84 – paragraph 1
Article 84 – paragraph 1
1. Each provider of relevant information society services shall draw up an annual report on its activities under this Regulation. That report shall compile the information referred to in Article 83(1). The providers shall, by 31 January of every year subsequent to the year to which the report relates, make the report available to the public in a machine-readable format and communicate it to the Coordinating Authority of establishment, the Commission and the EU Centre.
Amendment 1879 #
Proposal for a regulation
Article 84 – paragraph 1 a (new)
Article 84 – paragraph 1 a (new)
1a. The annual report shall also include the following information:
(a) the number and subject matter of detection orders and removal orders to act against alleged online child sexual abuse, the number of notifications received in accordance with Article 32, and the effects given to those orders;
(b) the number of notifications and requests received pursuant to Articles 8a and 35a and an overview of their follow-up;
(c) information on the effectiveness of the different technologies used and on the false positive and false negative rates of those technologies, as well as statistics on appeals and the effect they have on the users of its services, and information on the effectiveness of the measures and obligations under Articles 3, 4, 5 and 7;
(d) information on the tools used by the provider to become aware of the reported online child sexual abuse, including data and aggregate statistics on how technologies used by the provider work.
Amendment 1883 #
Proposal for a regulation
Article 86 – paragraph 2
Article 86 – paragraph 2
2. The power to adopt delegated acts referred to in Articles 3, 8, 13, 14, 17, 47 and 84 shall be conferred on the Commission for an indeterminate period of time from [date of adoption of the Regulation] period of 5 years from [date of adoption of the Regulation]. The Commission shall draw up a report in respect of the delegation of power not later than 9 months before the end of the five-year period. The delegation of power shall be tacitly extended for periods of an identical duration, unless the European Parliament or the Council opposes such extension not later than 3 months before the end of each period.
Amendment 1885 #
Proposal for a regulation
Article 89 – paragraph 3
Article 89 – paragraph 3
This Regulation shall be binding in its entirety and directly applicable in all Member States. If the proposed Regulation has not entered into force by August 2024, the regime in place shall be that of the interim derogation, which shall apply until such adoption, but no later than January 2025.
Amendment 1888 #
Proposal for a regulation
Annex I – title
Annex I – title
DETECTION ORDERWARRANT ISSUED IN ACCORDANCE WITH REGULATION (EU) …/… LAYING DOWN RULES TO PREVENT AND COMBAT CHILD SEXUAL ABUSE (‘THE REGULATION’)
Amendment 1889 #
Proposal for a regulation
Annex I – Section 1 – paragraph 2 – introductory part
Annex I – Section 1 – paragraph 2 – introductory part
Name of the competent judicial authority or the independent administrative authority having issued the detection orderwarrant:
Amendment 1890 #
Proposal for a regulation
Annex I – Section 4 – paragraph 2 – point 2
Annex I – Section 4 – paragraph 2 – point 2
Amendment 1893 #
Proposal for a regulation
Annex I – Section 4 – paragraph 2 – point 3
Annex I – Section 4 – paragraph 2 – point 3
Amendment 1895 #
Proposal for a regulation
Annex II – title
Annex II – title
TEMPLATE FOR INFORMATION ABOUT THE IMPOSSIBILITY TO EXECUTE THE DETECTION ORDERWARRANT referred to in Article 8(3) of Regulation (EU) .../… [laying down rules to prevent and combat child sexual abuse]
Amendment 1898 #
Proposal for a regulation
Annex III – Section 2 – point 2 – point 2
Annex III – Section 2 – point 2 – point 2
Amendment 1900 #
Proposal for a regulation
Annex III – Section 2 – point 2 – point 3
Annex III – Section 2 – point 2 – point 3
Amendment 1902 #
Proposal for a regulation
Annex III – Section 2 – point 3 – introductory part
Annex III – Section 2 – point 3 – introductory part
3) Content data related to the reported potential online child sexual abuse, including images, and videos and texts, as applicable:
Amendment 1903 #
Proposal for a regulation
Annex III – Section 2 – point 4
Annex III – Section 2 – point 4
Amendment 1907 #
Proposal for a regulation
Annex VII
Annex VII
Amendment 1909 #
Proposal for a regulation
Annex VIII
Annex VIII