95 Amendments of Birgit SIPPEL related to 2022/0155(COD)
Amendment 337 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification and parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.
Amendment 358 #
Proposal for a regulation
Recital 20 a (new)
(20a) End-to-end encryption is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any weakening of encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be prohibiting or weakening end-to-end encryption or be interpreted in that way.
Amendment 360 #
Proposal for a regulation
Recital 20 c (new)
(20c) The act of breaking encryption refers to the act of defeating or bypassing the encryption protocol used to secure a communication. Any access by any third party that was not meant to access, read or edit the content of that communication that was supposed to be private and secure should be considered as undermining encryption.
Amendment 364 #
Proposal for a regulation
Recital 21
(21) Furthermore, as part of those limits and safeguards, detection warrants should only be issued by a judicial authority and only with the purpose to detect known online child sexual abuse material related to a specific device or user account, where there is a reasonable suspicion that such content is stored on that device or in that user account. One of the main elements to be taken into account in this regard is the existence of evidence demonstrating a reasonable suspicion that individual accounts or groups of accounts are being used for the purpose of online child sexual abuse.
Amendment 368 #
Proposal for a regulation
Recital 22
(22) However, the existence of evidence demonstrating a reasonable suspicion that individual accounts or groups of accounts are being used for the purpose of online child sexual abuse should in itself be insufficient to justify the issuance of a detection warrant, given that in such a case the warrant might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. Therefore, it should be ensured that detection warrants can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority have objectively and diligently assessed, identified and weighted, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.
Amendment 502 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point b
(b) obligations on providers of hosting services and providers of number- independent interpersonal communication services to detect and report online child sexual abuse;
Amendment 508 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point c
(c) obligations on providers of hosting services to remove or disable access to child sexual abuse material on their services;
Amendment 511 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point d
Amendment 524 #
Proposal for a regulation
Article 1 – paragraph 3 – point d
(d) Regulation (EU) 2016/679, Directive 2016/680, Regulation (EU) 2018/1725, and, subject to paragraph 4 of this Article, Directive 2002/58/EC.
Amendment 529 #
Proposal for a regulation
Article 1 – paragraph 3 – point d a (new)
(da) Regulation (EU) …/… [laying down harmonised rules on artificial intelligence (Artificial Intelligence Act)];
Amendment 532 #
Proposal for a regulation
Article 1 – paragraph 3 a (new)
3a. This Regulation shall not prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption, or be interpreted in that way.
Amendment 534 #
Proposal for a regulation
Article 1 – paragraph 3 b (new)
3b. This Regulation shall not undermine the prohibition of general monitoring under Union law or introduce general data retention obligations, or be interpreted in that way.
Amendment 539 #
Proposal for a regulation
Article 1 – paragraph 4
4. This Regulation limits the exercise of the rights and obligations provided for in Article 5(1) and (3) and Article 6(1) of Directive 2002/58/EC with the sole objective of enabling a provider of hosting services, a provider of number-independent interpersonal communications services or a provider of an artificial intelligence system to use specific technologies for the processing of personal data to the extent strictly necessary to detect and report online child sexual abuse and remove child sexual abuse material on their services, following a detection warrant issued in accordance with Section 2 of Chapter 1 of this Regulation.
Amendment 542 #
Proposal for a regulation
Article 1 – paragraph 4 a (new)
4a. This Regulation does not apply to audio communications.
Amendment 563 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iii
Amendment 568 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iv a (new)
(iva) an artificial intelligence system;
Amendment 577 #
Proposal for a regulation
Article 2 – paragraph 1 – point j
Amendment 599 #
Proposal for a regulation
Article 2 – paragraph 1 – point s
(s) ‘content data’ means videos and images in a digital format;
Amendment 605 #
Proposal for a regulation
Article 2 – paragraph 1 – point w a (new)
(wa) ‘hotline’ means an organisation officially recognised by a Member State, other than the reporting channels provided by law enforcement authorities, for receiving anonymous complaints from victims and the public about alleged child sexual abuse;
Amendment 618 #
Proposal for a regulation
Article 3 – paragraph 1 a (new)
1a. Without prejudice to Regulation (EU) 2022/2065, when conducting the risk assessment, providers of hosting services and providers of interpersonal communications services shall respect and avoid any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity, respect for private and family life, the protection of personal data, freedom of expression and information, including the freedom and pluralism of the media, the prohibition of discrimination, the rights of the child and consumer protection, as enshrined in Articles 1, 7, 8, 11, 21, 24 and 38 of the Charter respectively.
Amendment 636 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
Amendment 645 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4
– functionalities enabling users to flag online child sexual abuse to the provider through tools that are easily recognisable, accessible, age-appropriate and child- and user-friendly, including anonymous reporting channels;
Amendment 649 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 a (new)
– systems and mechanisms that provide child- and user-friendly resources to ensure that children can seek help swiftly, including information on how to contact national child protection organisations or national law enforcement.
Amendment 714 #
Proposal for a regulation
Article 3 – paragraph 5
Amendment 718 #
Proposal for a regulation
Article 3 – paragraph 6
6. The Commission, in cooperation with Coordinating Authorities, the European Data Protection Board, the Fundamental Rights Agency and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 to 5, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used.
Amendment 724 #
Proposal for a regulation
Article 4 – title
Safety-by-design and risk mitigation
Amendment 763 #
Proposal for a regulation
Article 4 – paragraph 1 – point c
(c) initiating or adjusting cooperation, in accordance with competition law, with other providers of relevant information society services, public authorities, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC].
Amendment 767 #
Proposal for a regulation
Article 4 – paragraph 1 – point c a (new)
(ca) reinforcing awareness-raising measures and adapting their online interface for increased user information, including child-appropriate information targeted to the risk identified;
Amendment 772 #
Proposal for a regulation
Article 4 – paragraph 1 – point c b (new)
(cb) including clearly visible and identifiable information on the minimum age for using the service;
Amendment 819 #
Proposal for a regulation
Article 4 – paragraph 5
5. The Commission, in cooperation with Coordinating Authorities, the EU Centre, the European Data Protection Board and the Fundamental Rights Agency, and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1, 2, 3 and 4, having due regard in particular to relevant technological developments and to the manners in which the services covered by those provisions are offered and used. The European Commission, along with the European Data Protection Board and the Fundamental Rights Agency, shall issue guidelines on how providers may implement age verification and age assessment measures, in particular based on selective disclosure of attributes, with full respect for the Charter of Fundamental Rights and Regulation (EU) 2016/679.
Amendment 823 #
Proposal for a regulation
Article 4 – paragraph 5 a (new)
5a. Prior to the deployment of any specific technology pursuant to this Article, a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation must be conducted.
Amendment 835 #
Proposal for a regulation
Article 5 – paragraph 1 – point a
(a) the process and the results of the risk assessment conducted or updated pursuant to Article 3, including the assessment of any potential remaining risk referred to in Article 3(5);
Amendment 848 #
Proposal for a regulation
Article 5 – paragraph 6
Amendment 1056 #
Proposal for a regulation
Article 8 – title
Additional rules regarding detection warrants
Amendment 1061 #
Proposal for a regulation
Article 8 – paragraph 1 – introductory part
1. The competent judicial authority or independent administrative authority shall issue the detection warrants referred to in Article 7 using the template set out in Annex I. Detection warrants shall include:
Amendment 1066 #
Proposal for a regulation
Article 8 – paragraph 1 – point a a (new)
(aa) information, with respect to each device or user account, detailing the specific purpose and scope of the warrant, including the legal basis for the reasonable suspicion.
Amendment 1072 #
Proposal for a regulation
Article 8 – paragraph 1 – point e
Amendment 1077 #
Proposal for a regulation
Article 8 – paragraph 1 – point g
(g) a sufficiently detailed justification explaining why the detection warrant is issued and how it is necessary, effective and proportionate;
Amendment 1105 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of hosting services and providers of number-independent interpersonal communications services that have received a detection warrant, as well as users affected by the measures taken to execute it, shall have a right to information and effective redress. That right shall include the right to challenge the detection warrant before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the detection warrant.
Amendment 1145 #
Proposal for a regulation
Article 10 – paragraph 3 – point a
(a) effective in detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable;
Amendment 1147 #
Proposal for a regulation
Article 10 – paragraph 3 – point b
(b) not be able to extract any other information from the relevant communications than the information strictly necessary to detect, using the indicators referred to in paragraph 1, patterns pointing to the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable;
Amendment 1149 #
Proposal for a regulation
Article 10 – paragraph 3 – point c
(c) in accordance with the state of the art in the industry and the least intrusive in terms of the impact on the users’ rights to private and family life, including the confidentiality of communication, and to protection of personal data. It shall not weaken or undermine end-to-end encryption and shall not limit providers of information society services from providing their services applying end-to-end encryption;
Amendment 1158 #
Proposal for a regulation
Article 10 – paragraph 3 – point d a (new)
(da) ensure that the interference with the fundamental right to privacy and the other rights laid down in the Charter is limited to what is strictly necessary.
Amendment 1169 #
Proposal for a regulation
Article 10 – paragraph 4 – point -a (new)
(-a) ensure privacy by design and safety-by-design and by default and, where applicable, the protection of encryption.
Amendment 1183 #
Proposal for a regulation
Article 10 – paragraph 4 – point d
(d) establish and operate an accessible, age-appropriate and user- and child- friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of its obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner;
Amendment 1184 #
Proposal for a regulation
Article 10 – paragraph 4 – point e
(e) inform the Coordinating Authority and competent Data Protection Authority, at the latest one month before the start date specified in the detection order, on the implementation of the envisaged measures set out in the implementation plan referred to in Article 7(3);
Amendment 1187 #
Proposal for a regulation
Article 10 – paragraph 4 – point e a (new)
(ea) request, in respect of any specific technology used for the purpose set out in this Article, that a prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a prior consultation procedure as referred to in Article 36 of that Regulation be conducted;
Amendment 1190 #
Proposal for a regulation
Article 10 – paragraph 4 a (new)
4a. In respect of any specific technology used for the purpose set out in this Article, conduct a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation;
Amendment 1192 #
Proposal for a regulation
Article 10 – paragraph 5 – subparagraph 1 – point a
(a) the fact that it operates technologies to detect known child sexual abuse material to execute the detection warrant, the ways in which it operates those technologies and the impact on the users’ fundamental rights to private and family life, including the confidentiality of users’ communications and the protection of personal data;
Amendment 1215 #
Proposal for a regulation
Article 12 – paragraph 1
1. Where a provider of hosting services or a provider of number-independent interpersonal communications services becomes aware in any manner other than through a removal order issued in accordance with this Regulation of any information indicating alleged online child sexual abuse on its services, it shall report, without delay, that abuse to the competent law enforcement and independent judicial authorities and submit a report thereon to the EU Centre in accordance with Article 13. It shall do so through the system established in accordance with Article 39(2).
Amendment 1243 #
Proposal for a regulation
Article 13 – paragraph 1 – point c a (new)
(ca) where applicable, an exact uniform resource locator and, where necessary, additional information for the identification of the child sexual abuse material;
Amendment 1246 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
Amendment 1253 #
Proposal for a regulation
Article 13 – paragraph 1 – point f
Amendment 1273 #
Proposal for a regulation
Article 14 – paragraph 2
2. The provider shall execute the removal order as soon as possible and in any event within 24 hours of receipt thereof. For micro, small and medium enterprises, including open source providers, the removal order shall allow additional time, proportionate to the size and the resources of the provider.
Amendment 1286 #
1. Providers of hosting services that have received a removal order issued in accordance with Article 14, as well as the users who provided the material, shall have the right to an effective redress. That right shall include the right to challenge such a removal order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the removal order.
Amendment 1293 #
Proposal for a regulation
Chapter II – Section 5
Amendment 1297 #
Proposal for a regulation
Article 16
Amendment 1312 #
Proposal for a regulation
Article 17
Amendment 1321 #
Proposal for a regulation
Article 18
Amendment 1359 #
Proposal for a regulation
Article 21 – paragraph 1
1. Providers of relevant information society services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
Amendment 1367 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
Persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting services remove or disable access to one or more specific items of known child sexual abuse material depicting them. Persons with disabilities shall have the right to ask and receive any information relating to such support in a manner accessible to them.
Amendment 1374 #
Proposal for a regulation
Article 21 a (new)
Article 21a
Right to lodge a complaint with a supervisory authority
1. Without prejudice to any other administrative or judicial remedy, every user shall have the right to lodge a complaint with a supervisory authority, in particular in the Member State of his or her habitual residence, place of work or place of the alleged infringement, if the user considers that the processing of personal data relating to him or her infringes this Regulation or Regulation (EU) 2016/679.
2. The supervisory authority with which the complaint has been lodged shall inform the complainant of the progress and the outcome of the complaint, including the possibility of a judicial remedy pursuant to Article 21b.
Amendment 1375 #
Proposal for a regulation
Article 21 b (new)
Article 21b
Right to an effective judicial remedy against a provider of hosting services or a provider of a number-independent interpersonal communications service
1. Without prejudice to any available administrative or non-judicial remedy, including the right to lodge a complaint with a supervisory authority pursuant to Article 21a, each user shall have the right to an effective judicial remedy where he or she considers that his or her rights under this Regulation have been infringed as a result of the processing of his or her personal data in non-compliance with this Regulation or Regulation (EU) 2016/679.
2. Proceedings against a provider of a hosting service or a provider of a number-independent interpersonal communications service shall be brought before the courts of the Member State where the provider has an establishment. Alternatively, such proceedings may be brought before the courts of the Member State where the user has his or her habitual residence.
Amendment 1377 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 1 – introductory part
Providers of hosting services and providers of number-independent interpersonal communications services shall preserve the content data and other data processed in connection to the measures taken to comply with this Regulation and the personal data generated through such processing, only for one or more of the following purposes, as applicable:
Amendment 1384 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 2
Amendment 1478 #
Proposal for a regulation
Article 35 – paragraph 4 a (new)
4a. Member States shall ensure that penalties imposed for the infringement of this Regulation do not encourage the over-reporting or the removal of material which does not constitute child sexual abuse material.
Amendment 1479 #
Proposal for a regulation
Article 35 a (new)
Article 35a
Compensation
Users, and any body, organisation or association mandated to exercise the rights conferred by this Regulation on their behalf, shall have the right to seek, in accordance with Union and national law, compensation from providers of relevant information society services for any damage or loss suffered due to an infringement by those providers of their obligations under this Regulation.
Amendment 1514 #
Proposal for a regulation
Article 38 – paragraph 2 a (new)
2a. Coordinating Authorities shall increase public awareness regarding the nature of the problem of online child sexual abuse material, how to seek assistance, and how to work with providers of relevant information society services to remove content and coordinate victim identification efforts undertaken in collaboration with existing victim identification programmes.
Amendment 1525 #
Proposal for a regulation
Article 39 – paragraph 3 a (new)
Amendment 1577 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b a (new)
(ba) providing technical expertise and promoting the exchange of best practices among Member States on raising awareness for the prevention of child sexual abuse online in formal and non-formal education. Such efforts shall be age-appropriate and gender-sensitive;
Amendment 1582 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b b (new)
(bb) exchanging best practices among Coordinating Authorities regarding the available tools to reduce the risk of children becoming victims of sexual abuse and to provide specialized assistance to survivors, in an age-appropriate and gender-sensitive way.
Amendment 1603 #
Proposal for a regulation
Article 44 – paragraph 1 – point b
Amendment 1606 #
Proposal for a regulation
Article 44 – paragraph 1 – point c
Amendment 1716 #
Proposal for a regulation
Article 50 – paragraph 5
5. The EU Centre shall develop a communication strategy and promote dialogue with civil society organisations and providers of hosting or interpersonal communication services to raise public awareness of online child sexual abuse and measures to prevent and combat such abuse. Communication campaigns shall be easily understandable and accessible to all children, their families and educators in formal and non-formal education in the Union, aiming to improve digital literacy and ensure a safe digital environment for children. Communication campaigns shall take into account the gender dimension of the crime.
Amendment 1742 #
Proposal for a regulation
Article 53 – paragraph 2 – subparagraph 1
Amendment 1745 #
Proposal for a regulation
Article 53 – paragraph 2 – subparagraph 2
Amendment 1753 #
Proposal for a regulation
Article 53 – paragraph 3
3. The terms of cooperation and working arrangements shall be laid down in a publicly accessible memorandum of understanding.
Amendment 1762 #
Proposal for a regulation
Article 55 – paragraph 1 – introductory part
The administrative and management structure of the EU Centre shall be gender- balanced and comprise:
Amendment 1764 #
Proposal for a regulation
Article 55 – paragraph 1 – point d a (new)
(da) a Fundamental Rights Officer, who shall exercise the tasks set out in Article 66b;
Amendment 1765 #
Proposal for a regulation
Article 55 – paragraph 1 – point d b (new)
(db) an Experts’ Consultative Forum, which shall exercise the tasks set out in Article 66a;
Amendment 1767 #
Proposal for a regulation
Article 56 – paragraph 1
1. The Management Board shall be gender-balanced and composed of one representative from each Member State and two representatives of the Commission, all as members with voting rights.
Amendment 1780 #
Proposal for a regulation
Article 57 – paragraph 1 – point f a (new)
(fa) appoint a Data Protection Officer;
Amendment 1781 #
Proposal for a regulation
Article 57 – paragraph 1 – point f b (new)
(fb) appoint a Fundamental Rights Officer;
Amendment 1806 #
Proposal for a regulation
Article 66 a (new)
Amendment 1807 #
Proposal for a regulation
Chapter IV – Section 5 – Part 3 a (new)
Amendment 1888 #
Proposal for a regulation
Annex I – title
DETECTION WARRANT ISSUED IN ACCORDANCE WITH REGULATION (EU) …/… LAYING DOWN RULES TO PREVENT AND COMBAT CHILD SEXUAL ABUSE (‘THE REGULATION’)
Amendment 1889 #
Proposal for a regulation
Annex I – Section 1 – paragraph 2 – introductory part
Name of the competent judicial authority or the independent administrative authority having issued the detection warrant:
Amendment 1890 #
Proposal for a regulation
Annex I – Section 4 – paragraph 2 – point 2
Amendment 1893 #
Proposal for a regulation
Annex I – Section 4 – paragraph 2 – point 3
Amendment 1895 #
Proposal for a regulation
Annex II – title
TEMPLATE FOR INFORMATION ABOUT THE IMPOSSIBILITY TO EXECUTE THE DETECTION WARRANT referred to in Article 8(3) of Regulation (EU) .../… [laying down rules to prevent and combat child sexual abuse]
Amendment 1898 #
Proposal for a regulation
Annex III – Section 2 – point 2 – point 2
Amendment 1902 #
Proposal for a regulation
Annex III – Section 2 – point 3 – introductory part
3) Content data related to the reported potential online child sexual abuse, including images and videos, as applicable:
Amendment 1903 #
Proposal for a regulation
Annex III – Section 2 – point 4
Amendment 1907 #
Proposal for a regulation
Annex VII
Amendment 1909 #
Proposal for a regulation
Annex VIII