37 Amendments of Theresa BIELOWSKI related to 2022/0155(COD)
Amendment 532 #
Proposal for a regulation
Article 1 – paragraph 3 a (new)
3a. This Regulation shall not prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption, or be interpreted in that way.
Amendment 534 #
Proposal for a regulation
Article 1 – paragraph 3 b (new)
3b. This Regulation shall not undermine the prohibition of general monitoring under Union law or introduce general data retention obligations, or be interpreted in that way.
Amendment 608 #
Proposal for a regulation
Article -3 (new)
Article -3
Protection of fundamental human rights and confidentiality in communications
1. Nothing in this Regulation shall prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption, or be interpreted in that way.
2. Nothing in this Regulation shall undermine the prohibition of general monitoring under Union law or introduce general data retention obligations.
Amendment 610 #
Proposal for a regulation
Article 3 – paragraph 1
1. Providers of hosting services and providers of interpersonal communications services shall identify, analyse and assess any serious systemic risk stemming from the functioning and use of their services for the purpose of online child sexual abuse. That risk assessment shall be specific to the services they offer and proportionate to the serious systemic risk considering its severity and probability. To this end, providers subject to an obligation to conduct a risk assessment under Regulation (EU) 2022/2065 may draw on that risk assessment and complement it with a more specific assessment of the risks of use of their services for the purpose of online child sexual abuse.
Amendment 618 #
Proposal for a regulation
Article 3 – paragraph 1 a (new)
1a. Without prejudice to Regulation (EU) 2022/2065, when conducting the risk assessment, providers of hosting services and providers of interpersonal communications services shall respect fundamental rights and avoid any actual or foreseeable negative effects on their exercise, in particular the fundamental rights to human dignity, respect for private and family life, the protection of personal data, freedom of expression and information, including the freedom and pluralism of the media, the prohibition of discrimination, the rights of the child and consumer protection, as enshrined in Articles 1, 7, 8, 11, 21, 24 and 38 of the Charter respectively.
Amendment 622 #
Proposal for a regulation
Article 3 – paragraph 2 – point a
(a) any serious systemic risks and identified instances of use of its services for the purpose of online child sexual abuse;
Amendment 636 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
Amendment 726 #
Proposal for a regulation
Article 4 – paragraph -1 (new)
-1. Providers of hosting services and providers of interpersonal communications services shall have mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be online child sexual abuse. This obligation shall not be interpreted as an obligation of general monitoring or generalised data retention. Such mechanisms shall be easy to access, child-friendly, and shall allow for the submission of notices by electronic means. [By 6 months after entry into force] the Commission shall adopt a delegated act laying down design requirements for a uniform identifiable notification mechanism as referred to in this Article, including on the design of a uniform, easily recognisable icon in the user interface. Providers of hosting services and providers of interpersonal communications services targeting children may implement the design requirements specified in the delegated act referred to in this paragraph.
Amendment 731 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall put in place reasonable, proportionate and targeted mitigation measures, tailored to their services and the serious systemic risk identified pursuant to Article 3, with the aim of mitigating that risk. Such measures shall never entail a general monitoring obligation or generalised data retention obligation and shall include some or all of the following:
Amendment 736 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) testing and adapting, through state of the art technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision-making processes, the operation or functionalities of the service, or the content or enforcement of its terms and conditions, including the speed and quality of processing notices and reports related to online child sexual abuse and, where appropriate, the expeditious removal of the content notified;
Amendment 739 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) adapting the design, features and functions of their services in order to ensure a high level of privacy, data protection, safety and security by design and by default, including some or all of the following:
(a) limiting the ability of users, by default, to establish direct contact with other users, in particular through private communications;
(b) limiting the ability of users, by default, to directly share images or videos on services;
(c) limiting the ability of users, by default, to directly share personal contact details with other users, such as phone numbers, home addresses and e-mail addresses, via rules-based matching;
(d) limiting the ability of users, by default, to create screenshots or recordings within the service;
(e) limiting the ability of users, by default, to directly forward images and videos to other users where no consent has been given;
(f) allowing parents of a child or a legal representative of a child to make use of meaningful parental control tools, which protect the confidentiality of communications of the child;
(g) encouraging children, prior to registering for the service, to talk to their parents about how the service works and what parental control tools are available.
Services taking the measures outlined in this point may allow users to revert such measures on an individual level.
Amendment 763 #
Proposal for a regulation
Article 4 – paragraph 1 – point c
(c) initiating or adjusting cooperation, in accordance with competition law, with other providers of relevant information society services, public authorities, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC].
Amendment 767 #
Proposal for a regulation
Article 4 – paragraph 1 – point c a (new)
(ca) reinforcing awareness-raising measures and adapting their online interface for increased user information, including child-appropriate information targeted to the risk identified;
Amendment 772 #
Proposal for a regulation
Article 4 – paragraph 1 – point c b (new)
(cb) including clearly visible and identifiable information on the minimum age for using the service;
Amendment 773 #
Proposal for a regulation
Article 4 – paragraph 1 – point c c (new)
(cc) initiating targeted measures to protect the rights of the child and tools aimed at helping users to indicate child sexual abuse material and helping children to signal abuse or obtain support;
Amendment 777 #
Proposal for a regulation
Article 4 – paragraph 1 a (new)
1a. Providers of hosting services and providers of interpersonal communications services directly targeting children shall implement the design requirements as specified in the delegated act referred to in paragraph -1 and shall take all mitigation measures as outlined in paragraph 1, point (aa), of this Article to minimise the risk identified pursuant to Article 3. Such services shall allow users to revert mitigation measures on an individual level.
Amendment 886 #
Proposal for a regulation
Article 7 – paragraph 1
1. A competent judicial authority may issue, following a request by the Coordinating Authority of the Member State that designated the judicial authority, a detection warrant requiring a provider of hosting services or a provider of number-independent interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse material related to specific terminal equipment or a specific user account, where there is a reasonable suspicion that such content is stored on that terminal equipment or in that user account.
Amendment 1004 #
Proposal for a regulation
Article 7 – paragraph 6
Amendment 1010 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 1
Amendment 1046 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 3 a (new)
The European Data Protection Board shall also issue guidelines regarding the compliance with Regulation (EU) 2016/679 of existing and future technologies that are used for the detection of child sexual abuse material in encrypted and non-encrypted environments. Supervisory authorities as referred to in that Regulation shall supervise the application of those guidelines. Prior to the use of any specific technology pursuant to this Article, a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation must be conducted.
Amendment 1539 #
Proposal for a regulation
Article 42 – paragraph 1
The choice of the location of the seat of the Centre shall be made in accordance with the ordinary legislative procedure, based on the following criteria:
(a) it shall not affect the Centre’s execution of its tasks and powers, the organisation of its governance structure, the operation of its main organisation, or the main financing of its activities;
(b) it shall ensure that the Centre is able to recruit the highly qualified and specialised staff it requires to perform the tasks and exercise the powers provided by this Regulation;
(c) it shall ensure that it can be set up on site upon the entry into force of this Regulation;
(d) it shall ensure appropriate accessibility of the location, the existence of adequate education facilities for the children of staff members, appropriate access to the labour market, social security and medical care for both children and spouses;
(da) it shall ensure a balanced geographical distribution of EU institutions, bodies and agencies across the Union;
(db) it shall ensure its national child sexual abuse framework is of proven quality and repute, and shall benefit from the experience of national authorities;
(dc) it shall enable adequate training opportunities for combating child sexual abuse activities;
(dd) it shall enable close cooperation with EU institutions, bodies and agencies, but it shall be independent of any of the aforementioned;
(de) it shall ensure sustainability and digital security and connectivity with regard to physical and IT infrastructure and working conditions.
Amendment 1553 #
Proposal for a regulation
Article 43 – paragraph 1 – point 2
Amendment 1572 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point a
(a) collecting, recording, analysing and providing gender- and age-specific information, providing analysis based on anonymised and non-personal data gathering, including gender- and age-disaggregated data, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51;
Amendment 1575 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b
(b) supporting the development and dissemination of research and expertise on those matters and on assistance to survivors, taking into account the gender dimension, including by serving as a hub of expertise to support evidence-based policy;
Amendment 1577 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b a (new)
(ba) providing technical expertise and promoting the exchange of best practices among Member States on raising awareness for the prevention of child sexual abuse online in formal and non-formal education. Such efforts shall be age-appropriate and gender-sensitive;
Amendment 1582 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b b (new)
(bb) exchanging best practices among Coordinating Authorities regarding the available tools to reduce the risk of children becoming victims of sexual abuse and to provide specialised assistance to survivors, in an age-appropriate and gender-sensitive way;
Amendment 1585 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b c (new)
(bc) referring survivors to appropriate child protection services;
Amendment 1587 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c a (new)
Amendment 1588 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c b (new)
(cb) creating and overseeing an "EU hashing list of known child sexual abuse material" and modifying the content of that list, independently and autonomously, free of political, government or industry influence or interference;
Amendment 1589 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c c (new)
(cc) developing, in accordance with the implementing act referred to in Article 43a, the European Centralised Helpline for Abuse of Teenagers (eCHAT), interconnecting via effective interoperability the national hotlines' helplines, allowing children to reach out 24/7 via a recognisable central helpline in an anonymous way, in their own language and free of charge;
Amendment 1590 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c d (new)
(cd) having at its disposal the resources needed to develop hashing technology tools, where possible open source, for small and medium-sized relevant information society services to prevent the dissemination of known child sexual abuse material in publicly accessible content;
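The hashing tools envisaged in point (cd) rest on a simple mechanism: a digest of an item of content is compared against a list of known hashes, so the item itself never has to be interpreted by the service. A minimal sketch of that matching step, assuming a plain SHA-256 digest and a hypothetical `KNOWN_HASHES` set (deployed systems typically rely on perceptual hashes, which also match re-encoded copies of an image, something a plain cryptographic digest cannot do):

```python
import hashlib

# Hypothetical stand-in for the "EU hashing list" described in point (cb);
# real deployments use perceptual hashes, not plain SHA-256 digests.
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"test", used here purely as a placeholder
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_material(content: bytes) -> bool:
    """Return True if the content's SHA-256 digest appears on the hash list."""
    digest = hashlib.sha256(content).hexdigest()
    return digest in KNOWN_HASHES
```

Because only digests are compared, such a tool never needs to retain or inspect the content itself, which is how hash matching is usually argued to be compatible with the prohibition of general monitoring.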
Amendment 1591 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c e (new)
(ce) coordinating the sharing and filtering of Suspicious Activity Reports on alleged "known child sexual abuse material", operating independently, autonomously, free of political, government or industry influence or interference and in full respect of fundamental rights, including privacy and data protection. [By 1 year after entry into force] the Commission shall adopt a delegated act laying down requirements for a Suspicious Activity Reports format, as referred to in this paragraph, and the differentiation between actionable and non-actionable Suspicious Activity Reports. This delegated act shall not prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption, or be interpreted in that way;
Amendment 1592 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c f (new)
(cf) scanning public servers and public communications channels for known child sexual abuse material, with proven technology, solely for the purposes of amending the EU hashing list and flagging the content for removal to the service provider of the specific public server or public communications channel, without prejudice to Article -3. The European Data Protection Board shall issue guidelines regarding the compliance with Regulation (EU) 2016/679 of existing and future technologies that are used for the purpose of scanning.
Amendment 1597 #
Proposal for a regulation
Article 43 a (new)
Article 43a
Implementing act for the interconnection of helplines
1. The national helplines referred to in Article 43 shall be interconnected via the European Centralised Helpline for Abuse of Teenagers (eCHAT), to be developed and operated by the EU Centre by ... [two years after the date of entry into force of this Regulation].
2. The Commission shall be empowered to adopt, by means of implementing acts, technical specifications and procedures necessary to provide for the interconnection of national hotlines' online chat systems via eCHAT in accordance with Article 43 with regard to:
(a) the technical data necessary for the eCHAT system to perform its functions and the method of storage, use and protection of that technical data;
(b) the common criteria according to which national helplines shall be available through the system of interconnection of helplines;
(c) the technical details on how helplines shall be made available;
(d) the technical conditions of availability of services provided by the system of interconnection of helplines.
Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 5 of Regulation (EU) No 182/2011.
3. When adopting the implementing acts referred to in paragraph 2, the Commission shall take into account proven technology and existing practices.
Amendment 1786 #
Proposal for a regulation
Article 61 – paragraph 1 – subparagraph 1
The Executive Board shall be gender-balanced and composed of the Chairperson and the Deputy Chairperson of the Management Board, two other members appointed by the Management Board from among its members with the right to vote and two representatives of the Commission to the Management Board. The Chairperson of the Management Board shall also be the Chairperson of the Executive Board. The composition of the Executive Board shall take into consideration gender balance with at least 40 % of each sex.
Amendment 1806 #
Proposal for a regulation
Article 66 a (new)
Amendment 1807 #
Proposal for a regulation
Chapter IV – Section 5 – Part 3 a (new)