71 Amendments of Christian DOLESCHAL related to 2022/0155(COD)
Amendment 308 #
Proposal for a regulation
Recital 5
(5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services are publicly available. As services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of gaming, image-sharing and video-hosting, are equally at risk of misuse, they should also be covered by this Regulation. Online search engines and other artificial intelligence services should also be covered. However, given the inherent differences between the various relevant information society services covered by this Regulation and the related varying risks that those services are misused for the purpose of online child sexual abuse and varying ability of the providers concerned to prevent and combat such abuse, the obligations imposed on the providers of those services should be differentiated in an appropriate and targeted manner. Considering the fundamental importance of the right to respect for private life and the right to protection of personal data, as guaranteed by the Charter of Fundamental Rights, nothing in this Regulation should be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications.
Amendment 333 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take effective and reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) 2022/2065 may consider to which extent mitigation measures adopted to comply with that obligation may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation. Mitigation measures necessary for the fulfilment of the obligations in this Regulation may include the design of online interfaces or parts thereof with the highest level of privacy, safety and security for children by default, the adoption of standards for the protection of children, participation in codes of conduct for protecting children, targeted measures to protect the rights of the child, including age verification and age-appropriate parental control tools, and the enabling of flagging and/or notifying mechanisms and self-reporting functionalities, where possible with the use of AI.
Amendment 353 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when the provider refuses to cooperate by putting in place the mitigating measures aimed to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request, as a measure of last resort, the issuance of detection orders. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to providers of such services. Such detection orders shall be issued with regard to the technical capacity of the provider, and shall in no way be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encrypted content and communications.
Amendment 373 #
Proposal for a regulation
Recital 23
(23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is justified, proportionate and related only to an identifiable part of the specific service, user or group of users, as well as targeted and limited in time, so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively address the significant risk identified. This should concern, in particular, a limitation to an identifiable part or component of the service where possible without prejudice to the effectiveness of the measure, such as specific types of channels of a publicly available interpersonal communications service, or to specific users or specific groups of users, to the extent that they can be taken in isolation for the purpose of detection, as well as the specification of the safeguards additional to the ones already expressly specified in this Regulation, such as independent auditing, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of application of the detection order that the Coordinating Authority deems necessary. To avoid unreasonable or disproportionate outcomes, such requirements should be set after an objective and diligent assessment conducted on a case-by-case basis.
Amendment 383 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. In accordance with Article 6a, nothing in this Regulation shall be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encrypted content or communications through client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third party actors with access to the end-to-end encrypted content and communications. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
Amendment 389 #
Proposal for a regulation
Recital 26 a (new)
(26a) End-to-end encryption is an essential tool to guarantee the security, privacy and confidentiality of the communications between users, including those of children. Any weakening of end-to-end encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. Compromising the integrity of end-to-end encrypted content and communications shall be understood as the processing of any data that would compromise or put at risk the integrity and confidentiality of the aforementioned end-to-end encrypted content. Nothing in this Regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third party actors with access to the end-to-end encrypted content and communications.
Amendment 651 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 a (new)
- functionalities enabling age-appropriate parental controls, including with the use of AI;
Amendment 653 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 b (new)
- functionalities enabling self-reporting, including with the use of AI;
Amendment 695 #
Proposal for a regulation
Article 3 – paragraph 2 a (new)
2a. The provider, where applicable, shall assess, in a separate section of its risk assessment, the voluntary use of specific technologies for the processing of personal and other data to the extent strictly necessary to detect, to report and to remove online child sexual abuse material from its services. Such voluntary use of specific technologies shall under no circumstances undermine the integrity and confidentiality of end-to-end encrypted content and communications.
Amendment 705 #
Proposal for a regulation
Article 3 – paragraph 4 – subparagraph 2
Amendment 862 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
(b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children; or where:
Amendment 864 #
Proposal for a regulation
Article 6 – paragraph 1 – point b – point i (new)
i) the developer of the software application has decided and informed the software application store that its terms and conditions of use do not permit child users,
Amendment 865 #
Proposal for a regulation
Article 6 – paragraph 1 – point b – point ii (new)
ii) the software application has an appropriate age rating model in place, or
Amendment 866 #
Proposal for a regulation
Article 6 – paragraph 1 – point b – point iii (new)
iii) the developer of the software application has requested the software application store not to allow child users to download its software applications.
Amendment 875 #
Proposal for a regulation
Article 6 a (new)
Article 6a
End-to-end encrypted services
Nothing in this Regulation shall be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. Compromising the integrity of end-to-end encrypted content and communications shall be understood as the processing of any data that would compromise or put at risk the integrity and confidentiality of the content and communications in the end-to-end encryption. Nothing in this Regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communications services provides third party actors access to the end-to-end encrypted content.
Amendment 876 #
Proposal for a regulation
Article 6 a (new)
Article 6a
End-to-end encrypted services
Nothing in this Regulation shall be interpreted as prohibiting, weakening or compromising the integrity and confidentiality of end-to-end encrypted content and communications. Nothing in this Regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third party actors access to end-to-end encrypted content. No provider of a hosting service or provider of interpersonal communication services shall be compelled to enable or create access to communications by means of bypassing user authentication or encryption under the scope of this Regulation.
Amendment 882 #
Proposal for a regulation
Article 7 – title
Issuance of targeted detection orders
Amendment 888 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a targeted detection order requiring a provider of hosting services or a provider of interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect individual cases of online child sexual abuse on a specific service. The scope of a targeted detection order shall be limited to individual users or groups of users for whom there is evidence suggesting that their conduct might have a link with child sexual abuse offences.
Amendment 893 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it or another independent administrative authority of that Member State to issue a targeted detection order requiring a provider of hosting services or a provider of interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse on a specific service.
Amendment 900 #
Proposal for a regulation
Article 7 – paragraph 2 – subparagraph 1
The request of the Coordinating Authority of establishment for a targeted detection order shall include any evidence suggesting individual or collective conduct that establishes a link with child sexual abuse offences, in particular previous offences.
Amendment 903 #
Proposal for a regulation
Article 7 – paragraph 2 – subparagraph 2
To that end, it may, where appropriate, require the provider to submit the necessary evidence within a reasonable time period set by that Coordinating Authority, or request the EU Centre, another public authority or relevant experts or entities to provide the necessary additional information.
Amendment 907 #
Proposal for a regulation
Article 7 – paragraph 3
Amendment 950 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – introductory part
The Coordinating Authority of establishment shall request the issuance of the detection order, and the competent judicial authority or independent administrative authority shall issue the detection order where it considers that the following conditions are met:
Amendment 953 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – introductory part
The Coordinating Authority of establishment shall request the issuance of the targeted detection order, and the competent judicial authority or independent administrative authority shall issue the targeted detection order in accordance with the applicable legal standard for evidence in criminal law.
Amendment 955 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point a
Amendment 964 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b
Amendment 975 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2
Amendment 997 #
Proposal for a regulation
Article 7 – paragraph 5
Amendment 1000 #
Proposal for a regulation
Article 7 – paragraph 6
Amendment 1009 #
Proposal for a regulation
Article 7 – paragraph 7
Amendment 1016 #
Proposal for a regulation
Article 7 – paragraph 8
Amendment 1017 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 1
The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial or independent administrative authority when issuing the detection order, shall, in accordance with Article 8 of Regulation (EU) 2022/2065, target and specify it in such a manner that the negative consequences referred to in paragraph 4, first subparagraph, point (b), remain limited to what is strictly necessary, justifiable and proportionate to effectively address the significant risk referred to in point (a) thereof, and limit the detection order to an identifiable part or component of a service, such as a specific channel of communication or a specific group of users identified with particularity for which the significant risk has been identified. In accordance with Article 6a, no such detection order shall be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encrypted content and communications.
Amendment 1018 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 1
The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial or independent administrative authority when issuing the targeted detection order, shall target and specify it in such a manner that the negative consequences referred to in paragraph 4, first subparagraph, point (b), remain limited to what is strictly necessary, effective and proportionate with regard to the applicable standards of criminal law.
Amendment 1037 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 1
The competent judicial authority or independent administrative authority shall specify in the targeted detection order the period during which it applies, indicating the start date and the end date.
Amendment 1040 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 2
The start date shall be set taking into account the time reasonably required for the provider to take the necessary measures to prepare the execution of the targeted detection order. It shall not be earlier than three months from the date at which the provider received the targeted detection order and not be later than 12 months from that date.
Amendment 1044 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 3
The period of application of targeted detection orders concerning the dissemination of known or new child sexual abuse material shall not exceed 24 months and that of detection orders concerning the solicitation of children shall not exceed 12 months.
Amendment 1054 #
Proposal for a regulation
Article 8 – title
Additional rules regarding targeted detection orders
Amendment 1059 #
Proposal for a regulation
Article 8 – paragraph 1 – introductory part
1. The competent judicial authority or independent administrative authority shall issue the detection orders referred to in Article 7 using the template set out in Annex I. Targeted detection orders shall include at minimum:
Amendment 1063 #
Proposal for a regulation
Article 8 – paragraph 1 – introductory part
1. The competent judicial authority or independent administrative authority shall issue the targeted detection orders referred to in Article 7 using the template set out in Annex I. Targeted detection orders shall include:
Amendment 1065 #
Proposal for a regulation
Article 8 – paragraph 1 – point a
(a) information regarding the measures to be taken to execute the detection order, including the indicators to be used and the safeguards to be provided for, including the reporting requirements set pursuant to Article 9(3) and, where applicable, any additional safeguards as referred to in Article 7(8);
Amendment 1068 #
Proposal for a regulation
Article 8 – paragraph 1 – point b
(b) identification details of the competent judicial authority or the independent administrative authority issuing the detection order and authentication of the targeted detection order by that judicial or independent administrative authority;
Amendment 1069 #
Proposal for a regulation
Article 8 – paragraph 1 – point c a (new)
(ca) the name of the user(s) for whom a targeted detection order has been issued, insofar as it is known, and digital aliases in use by the user(s).
Amendment 1070 #
Proposal for a regulation
Article 8 – paragraph 1 – point d
(d) the specific service in respect of which the targeted detection order is issued and, where applicable, the part or component of the service affected as referred to in Article 7(8);
Amendment 1074 #
Proposal for a regulation
Article 8 – paragraph 1 – point e
(e) whether the targeted detection order issued concerns the dissemination of known or new child sexual abuse material or the solicitation of children;
Amendment 1076 #
Proposal for a regulation
Article 8 – paragraph 1 – point f
(f) the start date and the end date of the targeted detection order;
Amendment 1078 #
Proposal for a regulation
Article 8 – paragraph 1 – point g
(g) a sufficiently detailed statement of evidence explaining why the targeted detection order is issued;
Amendment 1081 #
Proposal for a regulation
Article 8 – paragraph 1 – point h
(h) a reference to this Regulation as the legal basis for the targeted detection order;
Amendment 1083 #
Proposal for a regulation
Article 8 – paragraph 1 – point i
(i) the date, time stamp and electronic signature of the judicial or independent administrative authority issuing the targeted detection order;
Amendment 1084 #
Proposal for a regulation
Article 8 – paragraph 1 – point j
(j) easily understandable information about the redress available to the addressee of the targeted detection order, including information about redress to a court and about the time periods applicable to such redress.
Amendment 1088 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 1
The competent judicial authority or independent administrative authority issuing the targeted detection order shall address it to the main establishment of the provider or, where applicable, to its legal representative designated in accordance with Article 24.
Amendment 1089 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 2
The targeted detection order shall be transmitted to the provider’s point of contact referred to in Article 23(1), to the Coordinating Authority of establishment and to the EU Centre, through the system established in accordance with Article 39(2).
Amendment 1091 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 3
The targeted detection order shall be drafted in the language declared by the provider pursuant to Article 23(3).
Amendment 1094 #
Proposal for a regulation
Article 8 – paragraph 3
3. If the provider cannot execute the detection order because it contains manifest errors or does not contain sufficient information for its execution, the provider shall, without undue delay, request the necessary clarification to the Coordinating Authority of establishment, using the template set out in Annex II.
Amendment 1096 #
Proposal for a regulation
Article 8 – paragraph 4
4. The Commission shall be empowered to adopt delegated acts in accordance with Article 86 in order to amend Annexes I and II where necessary to improve the templates in view of relevant technological developments or practical experiences gained.
Amendment 1101 #
Proposal for a regulation
Article 9 – title
Redress, information, reporting and modification of targeted detection orders
Amendment 1106 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of hosting services and providers of interpersonal communications services that have received a targeted detection order, as well as users affected by the measures taken to execute it, shall have a right to effective redress. That right shall include the right to challenge the targeted detection order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the detection order.
Amendment 1111 #
Proposal for a regulation
Article 9 – paragraph 2 – subparagraph 1
When the targeted detection order becomes final, the competent judicial authority or independent administrative authority that issued the targeted detection order shall, without undue delay, transmit a copy thereof to the Coordinating Authority of establishment. The Coordinating Authority of establishment shall then, without undue delay, transmit a copy thereof to all other Coordinating Authorities through the system established in accordance with Article 39(2).
Amendment 1112 #
Proposal for a regulation
Article 9 – paragraph 2 – subparagraph 2
For the purpose of the first subparagraph, a targeted detection order shall become final upon the expiry of the time period for appeal where no appeal has been lodged in accordance with national law or upon confirmation of the targeted detection order following an appeal.
Amendment 1115 #
Proposal for a regulation
Article 9 – paragraph 3 – subparagraph 1
Where the period of application of the detection order exceeds 12 months, or six months in the case of a detection order concerning the solicitation of children, the Coordinating Authority of establishment shall require the provider to report to it on the execution of the detection order at least once, halfway through the period of application.
Amendment 1118 #
Proposal for a regulation
Article 9 – paragraph 3 – subparagraph 2
Those reports shall include a detailed description of the measures taken to execute the detection order, including the safeguards provided, and information on the functioning in practice of those measures, in particular on their effectiveness in detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, and on the consequences of those measures for the rights and legitimate interests of all parties affected.
Amendment 1120 #
Proposal for a regulation
Article 9 – paragraph 4 – subparagraph 1
In respect of the targeted detection orders that the competent judicial authority or independent administrative authority issued at its request, the Coordinating Authority of establishment shall, where necessary and in any event following reception of the reports referred to in paragraph 3, assess whether any substantial changes to the grounds for issuing the detection orders occurred and, in particular, whether the evidence has been substantiated.
Amendment 1122 #
Proposal for a regulation
Article 9 – paragraph 4 – subparagraph 2
That Coordinating Authority shall request to the competent judicial authority or independent administrative authority that issued the detection order the modification or revocation of such order, where necessary in the light of the outcome of that assessment. The provisions of this Section shall apply to such requests, mutatis mutandis.
Amendment 1131 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of hosting services and providers of interpersonal communication services that have received a targeted detection order shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided by the EU Centre in accordance with Article 46 and with Article 6a.
Amendment 1139 #
Proposal for a regulation
Article 10 – paragraph 2
2. The provider shall be entitled to acquire, install and operate, free of charge, technologies made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the targeted detection order. The provider shall not be required to use any specific technology, including those made available by the EU Centre, as long as the requirements set out in this Article are met. The use of the technologies made available by the EU Centre shall not affect the responsibility of the provider to comply with those requirements and for any decisions it may take in connection to or as a result of the use of the technologies.
Amendment 1173 #
Proposal for a regulation
Article 10 – paragraph 4 – point a
(a) take all the necessary measures to ensure that the technologies and indicators, as well as the processing of personal data and other data in connection thereto, are used for the sole purpose of detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, insofar as strictly necessary to execute the targeted detection orders addressed to them;
Amendment 1186 #
Proposal for a regulation
Article 10 – paragraph 4 – point e
(e) inform the Coordinating Authority, at the latest one month before the start date specified in the targeted detection order, on the implementation of the envisaged measures set out in the implementation plan referred to in Article 7(3);
Amendment 1195 #
Proposal for a regulation
Article 10 – paragraph 5 – subparagraph 1 – point a
(a) the fact that it operates technologies to detect online child sexual abuse to execute the targeted detection order, the ways in which it operates those technologies and the impact on the confidentiality of users’ communications;
Amendment 1199 #
Proposal for a regulation
Article 10 – paragraph 5 – subparagraph 2
The provider shall not provide information to users that may reduce the effectiveness of the measures to execute the targeted detection order, notwithstanding Article 6a and general advice on confidential communication.
Amendment 1202 #
Proposal for a regulation
Article 10 – paragraph 6
6. Where a provider detects potential online child sexual abuse through the measures taken to execute the targeted detection order, it shall inform the users concerned without undue delay, after Europol or the national law enforcement authority of a Member State that received the report pursuant to Article 48 has confirmed that the information to the users would not interfere with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.
Amendment 1314 #
Proposal for a regulation
Article 17 – paragraph 1 – point d
(d) the specific service in respect of which the targeted detection order is issued;
Amendment 1886 #
Proposal for a regulation
Annex I