25 Amendments of Jörgen WARBORN related to 2022/0155(COD)
Amendment 302 #
Proposal for a regulation
Recital 4
(4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in a manner that is effective, well targeted and proportionate and that respects the fundamental rights and privacy of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in a technology-neutral and future-proof manner, so as not to hamper innovation.
Amendment 333 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take effective and reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) 2022/2065 may consider to which extent mitigation measures adopted to comply with that obligation may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation. Mitigation measures necessary for the fulfilment of the obligations in this Regulation may include the design of online interfaces or parts thereof with the highest level of privacy, safety and security for children by default, the adoption of standards for the protection of children, participation in codes of conduct for protecting children, and targeted measures to protect the rights of the child, including age-appropriate parental control tools. Enabling flagging and/or notifying mechanisms and self-reporting functionalities, where possible with the use of AI, shall serve to address the risk identified in the specific risk assessment pursuant to this Regulation.
Amendment 353 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when the provider refuses to cooperate by putting in place the mitigating measures aimed to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request, as a measure of last resort, the issuance of detection orders. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to providers of such services. Such detection orders shall be issued with regard to the technical capacity of the provider, and shall in no way be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encrypted content and communications.
Amendment 356 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when mitigating measures are deemed insufficient to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the issuance of detection orders. Such orders should not apply to end-to-end encryption services. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to providers of such services.
Amendment 365 #
Proposal for a regulation
Recital 21
(21) Furthermore, as parts of those limits and safeguards, detection orders should only be issued after a diligent and objective assessment leading to the finding of a significant risk of the specific service concerned being misused for a given type of online child sexual abuse covered by this Regulation. Such detection orders should as far as possible be restricted and specified, not calling for mass detection. One of the elements to be taken into account in this regard is the likelihood that the service is used to an appreciable extent, that is, beyond isolated and relatively rare instances, for such abuse. The criteria should vary so as to take account of the different characteristics of the various types of online child sexual abuse at stake and of the different characteristics of the services used to engage in such abuse, as well as the related different degree of intrusiveness of the measures to be taken to execute the detection order.
Amendment 373 #
Proposal for a regulation
Recital 23
(23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is justified, proportionate and related only to an identifiable part of the specific service, user or group of users, as well as targeted and limited in time, so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively address the significant risk identified. This should concern, in particular, a limitation to an identifiable part or component of the service where possible without prejudice to the effectiveness of the measure, such as specific types of channels of a publicly available interpersonal communications service, or to specific users or specific groups of users, to the extent that they can be taken in isolation for the purpose of detection, as well as the specification of the safeguards additional to the ones already expressly specified in this Regulation, such as independent auditing, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of application of the detection order that the Coordinating Authority deems necessary. To avoid unreasonable or disproportionate outcomes, such requirements should be set after an objective and diligent assessment conducted on a case-by-case basis.
Amendment 381 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. End-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children, should be safeguarded. This includes no possibility within end-to-end encryption technology to build in so called ‘backdoors’, i.e. client-side scanning with side-channel leaks which could weaken the end-to-end encryption and lead to a third party getting access to private data. Client-side scanning, whereby a message is scanned twice, on sending and receiving, threatens the integrity and privacy of users. Such ‘backdoors’ should not be built into end-to-end encryption in the pursuit of enforcing this Regulation.
When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
Amendment 383 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. In accordance with Article 6a, nothing in this Regulation shall be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encrypted content or communications through client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third party actors with access to the end-to-end encrypted content and communications. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
Amendment 389 #
Proposal for a regulation
Recital 26 a (new)
(26a) End-to-end encryption is an essential tool to guarantee the security, privacy and confidentiality of the communications between users, including those of children. Any weakening of the effect of end-to-end encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. Compromising the integrity of end-to-end encrypted content and communications shall be understood as the processing of any data that would compromise or put at risk the integrity and confidentiality of that end-to-end encrypted content. Nothing in this Regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third party actors with access to the end-to-end encrypted content and communications.
Amendment 391 #
Proposal for a regulation
Recital 26 a (new)
(26a) End-to-end encryption is vital for the security and privacy of the communications of users. The detection obligations set out in this Regulation should therefore not apply to end-to-end encrypted services, since applying them would risk jeopardizing the integrity of such services. Consequently, the encryption should remain confidential, without the possibility of side-channel leak mechanisms being built in by the service providers, which would endanger the privacy of users.
Amendment 397 #
Proposal for a regulation
Recital 27
(27) In order to facilitate the providers’ compliance with the detection obligations, the EU Centre should make available to providers detection technologies that they may choose to use, on a free-of-charge basis, for the sole purpose of executing the detection orders addressed to them. The European Data Protection Board must be consulted on those technologies and the ways in which they should be best deployed to ensure compliance with applicable rules of Union law on the protection of personal data. The advice of the European Data Protection Board must be taken into account by the EU Centre when compiling the lists of available technologies and also by the Commission when preparing guidelines regarding the application of the detection obligations. The providers may operate the technologies made available by the EU Centre or by others or technologies that they developed themselves, as long as they meet the requirements of this Regulation.
Amendment 651 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 a (new)
- functionalities enabling age- appropriate parental controls, including with the use of AI;
Amendment 653 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 b (new)
- functionalities enabling self- reporting, including with the use of AI;
Amendment 732 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall take reasonable mitigation measures, taking into account the right to private life and personal data protection, tailored to the risk identified pursuant to Article 3, to minimise that risk. Such measures shall include some or all of the following:
Amendment 744 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) providing security by design, as a way of ensuring services that are safe and secure, especially for children;
Amendment 747 #
Proposal for a regulation
Article 4 – paragraph 1 – point a b (new)
(ab) providing several reporting functions within their services, so that users of the services can report and flag content and material;
Amendment 795 #
Proposal for a regulation
Article 4 – paragraph 2 – point c a (new)
(ca) done in a way that does not compromise end-to-end encryption;
Amendment 862 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
(b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children, where:
Amendment 864 #
Proposal for a regulation
Article 6 – paragraph 1 – point b – point i (new)
i) the developer of the software application has decided and informed the software application store that its terms and conditions of use do not permit child users,
Amendment 865 #
Proposal for a regulation
Article 6 – paragraph 1 – point b – point ii (new)
ii) the software application has an appropriate age rating model in place, or
Amendment 866 #
Proposal for a regulation
Article 6 – paragraph 1 – point b – point iii (new)
iii) the developer of the software application has requested the software application store not to allow child users to download its software applications.
Amendment 875 #
Proposal for a regulation
Article 6 a (new)
Article 6a
End-to-end encrypted services
Nothing in this Regulation shall be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. Compromising the integrity of end-to-end encrypted content and communications shall be understood as the processing of any data that would compromise or put at risk the integrity and confidentiality of the content and communications in the end-to-end encryption. Nothing in this Regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communications services provides third party actors with access to the end-to-end encrypted content.
Amendment 898 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
1a. Such a detection order shall as far as possible be restricted and specified, not calling for mass detection across the whole service.
Amendment 1017 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 1
The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial or independent administrative authority when issuing the detection order, shall, in accordance with Article 8 of Regulation (EU) 2022/2065, target and specify it in such a manner that the negative consequences referred to in paragraph 2 remain limited to what is strictly necessary, justifiable and proportionate to effectively address the significant risk referred to in point (a) thereof, and limit the detection order to an identifiable part or component of a service, such as a specific channel of communication or a specific group of users identified with particularity for which the significant risk has been identified. In accordance with Article 6a, no such detection order shall be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encrypted content and communications.
Amendment 1204 #
Proposal for a regulation
Article 10 a (new)
Article 10a
Safeguarding end-to-end encryption
The integrity of end-to-end encrypted services must be safeguarded. The detection obligations set out in this section shall therefore not apply to end-to-end encrypted services. This includes, inter alia, no possibility within end-to-end encryption technology to build in so called ‘backdoors’, i.e. client-side scanning with side-channel leaks which could weaken the end-to-end encryption and lead to a third party getting access to private data. Client-side scanning, whereby a message is scanned twice, on sending and receiving, threatens the integrity and privacy of users. Such ‘backdoors’ shall not be built into end-to-end encryption in the pursuit of enforcing this Regulation.