
37 Amendments of Theresa BIELOWSKI related to 2022/0155(COD)

Amendment 532 #
Proposal for a regulation
Article 1 – paragraph 3 a (new)
3a. This Regulation shall not prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption, or be interpreted in that way.
2023/07/28
Committee: LIBE
Amendment 534 #
Proposal for a regulation
Article 1 – paragraph 3 b (new)
3b. This Regulation shall not undermine the prohibition of general monitoring under Union law or introduce general data retention obligations, or be interpreted in that way.
2023/07/28
Committee: LIBE
Amendment 608 #
Proposal for a regulation
Article -3 (new)
Article -3
Protection of fundamental human rights and confidentiality in communications
1. Nothing in this Regulation shall prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption or be interpreted in that way.
2. Nothing in this Regulation shall undermine the prohibition of general monitoring under Union law or introduce general data retention obligations.
2023/07/28
Committee: LIBE
Amendment 610 #
Proposal for a regulation
Article 3 – paragraph 1
1. Providers of hosting services and providers of interpersonal communications services shall identify, analyse and assess any serious systemic risk stemming from the functioning and use of their services for the purpose of online child sexual abuse. That risk assessment shall be specific to the services they offer and proportionate to the serious systemic risk, considering its severity and probability. To this end, providers subject to an obligation to conduct a risk assessment under Regulation (EU) 2022/2065 may draw on that risk assessment and complement it with a more specific assessment of the risks of use of their services for the purpose of online child sexual abuse.
2023/07/28
Committee: LIBE
Amendment 618 #
Proposal for a regulation
Article 3 – paragraph 1 a (new)
1a. Without prejudice to Regulation (EU) 2022/2065, when conducting the risk assessment, providers of hosting services and providers of interpersonal communications services shall respect fundamental rights and avoid any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity, respect for private and family life, the protection of personal data, freedom of expression and information, including the freedom and pluralism of the media, the prohibition of discrimination, the rights of the child and consumer protection, as enshrined in Articles 1, 7, 8, 11, 21, 24 and 38 of the Charter respectively.
2023/07/28
Committee: LIBE
Amendment 622 #
Proposal for a regulation
Article 3 – paragraph 2 – point a
(a) any serious systemic risks and identified instances of use of its services for the purpose of online child sexual abuse;
2023/07/28
Committee: LIBE
Amendment 636 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
– functionalities enabling age verification; (deleted)
2023/07/28
Committee: LIBE
Amendment 726 #
Proposal for a regulation
Article 4 – paragraph -1 (new)
-1. Providers of hosting services and providers of interpersonal communications services shall have mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be online child sexual abuse. This obligation shall not be interpreted as an obligation of general monitoring or generalised data retention. Such mechanisms shall be easy to access, child-friendly, and shall allow for the submission of notices by electronic means. [By 6 months after entry into force] the Commission shall adopt a delegated act laying down design requirements for a uniform identifiable notification mechanism as referred to in this Article, including on the design of a uniform, easily recognisable icon in the user interface. Providers of hosting services and providers of interpersonal communications services targeting children may implement the design requirements specified in the delegated act referred to in this paragraph.
2023/07/28
Committee: LIBE
Amendment 731 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of interpersonal communications services shall put in place reasonable, proportionate and targeted mitigation measures, tailored to their services and the serious systemic risk identified pursuant to Article 3, with the aim of mitigating that risk. Such measures shall never entail a general monitoring obligation or generalised data retention obligation and shall include some or all of the following:
2023/07/28
Committee: LIBE
Amendment 736 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) testing and adapting, through appropriate, state of the art technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision-making processes, the operation or functionalities of the service, or the content or enforcement of its terms and conditions, including the speed and quality of processing notices and reports related to online child sexual abuse and, where appropriate, the expeditious removal of the content notified;
2023/07/28
Committee: LIBE
Amendment 739 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(aa) adapting the design, features and functions of their services in order to ensure a high level of privacy, data protection, safety, and security by design and by default, including some or all of the following:
(a) limiting users, by default, to establish direct contact with other users, in particular through private communications;
(b) limiting users, by default, to directly share images or videos on services;
(c) limiting users, by default, to directly share personal contact details with other users, such as phone numbers, home addresses and e-mail addresses, via rules-based matching;
(d) limiting users, by default, to create screenshots or recordings within the service;
(e) limiting users, by default, to directly re-forward images and videos to other users where no consent has been given;
(f) allowing parents of a child or a legal representative of a child to make use of meaningful parental control tools, which protect the confidentiality of communications of the child;
(g) encouraging children, prior to registering for the service, to talk to their parents about how the service works and what parental control tools are available.
Services taking the measures outlined in this point may allow users to revert such measures on an individual level.
2023/07/28
Committee: LIBE
Amendment 763 #
Proposal for a regulation
Article 4 – paragraph 1 – point c
(c) initiating or adjusting cooperation, in accordance with competition law, with other providers of relevant information society services, public authorities, civil society organisations or, where applicable, entities awarded the status of trusted flaggers in accordance with Article 19 of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC].
2023/07/28
Committee: LIBE
Amendment 767 #
Proposal for a regulation
Article 4 – paragraph 1 – point c a (new)
(ca) reinforcing awareness-raising measures and adapting their online interface for increased user information, including child-appropriate information targeted to the risk identified;
2023/07/28
Committee: LIBE
Amendment 772 #
Proposal for a regulation
Article 4 – paragraph 1 – point c b (new)
(cb) including clearly visible and identifiable information on the minimum age for using the service;
2023/07/28
Committee: LIBE
Amendment 773 #
Proposal for a regulation
Article 4 – paragraph 1 – point c c (new)
(cc) initiating targeted measures to protect the rights of the child and tools aimed at helping users to indicate child sexual abuse material and helping children to signal abuse or obtain support;
2023/07/28
Committee: LIBE
Amendment 777 #
Proposal for a regulation
Article 4 – paragraph 1 a (new)
1a. Providers of hosting services and providers of interpersonal communications services directly targeting children shall implement the design requirements as specified in the delegated act referred to in paragraph -1 and shall take all mitigation measures as outlined in paragraph 1, point (aa), of this Article to minimise that risk. Such services shall allow users to revert mitigation measures on an individual level.
2023/07/28
Committee: LIBE
Amendment 886 #
Proposal for a regulation
Article 7 – paragraph 1
1. A competent judicial authority may issue, following a request by the Coordinating Authority of the Member State that designated it or another independent administrative authority of that Member State to the judicial authority, a detection warrant requiring a provider of hosting services or a provider of number-independent interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect online child sexual abuse material related to specific terminal equipment or a specific user account, where there is a reasonable suspicion that such content is stored on that terminal equipment or in that user account.
2023/07/28
Committee: LIBE
Amendment 1004 #
Proposal for a regulation
Article 7 – paragraph 6
6. As regards detection orders concerning the dissemination of new child sexual abuse material, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met: (a) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the dissemination of new child sexual abuse material; (b) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent, for the dissemination of new child sexual abuse material; (c) for services other than those enabling the live transmission of pornographic performances as defined in Article 2, point (e), of Directive 2011/93/EU: (1) a detection order concerning the dissemination of known child sexual abuse material has been issued in respect of the service; (2) the provider submitted a significant number of reports concerning known child sexual abuse material, detected through the measures taken to execute the detection order referred to in point (1), pursuant to Article 12. (deleted)
2023/07/28
Committee: LIBE
Amendment 1010 #
Proposal for a regulation
Article 7 – paragraph 7 – subparagraph 1
As regards detection orders concerning the solicitation of children, the significant risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met: (a) the provider qualifies as a provider of interpersonal communication services; (b) it is likely that, despite any mitigation measures that the provider may have taken or will take, the service is used, to an appreciable extent, for the solicitation of children; (c) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months and to an appreciable extent, for the solicitation of children. (deleted)
2023/07/28
Committee: LIBE
Amendment 1046 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 3 a (new)
The European Data Protection Board shall also issue guidelines regarding the compliance with Regulation (EU) 2016/679 of existing and future technologies that are used for the detection of child sexual abuse material in encrypted and non-encrypted environments. Supervisory authorities as referred to in that Regulation shall supervise the application of those guidelines. Prior to the use of any specific technology pursuant to this Article, a mandatory prior data protection impact assessment as referred to in Article 35 of Regulation (EU) 2016/679 and a mandatory prior consultation procedure as referred to in Article 36 of that Regulation must be conducted.
2023/07/28
Committee: LIBE
Amendment 1539 #
Proposal for a regulation
Article 42 – paragraph 1
The choice of the location of the seat of the Centre shall be made in accordance with the ordinary legislative procedure, based on the following criteria:
(a) it shall not affect the Centre’s execution of its tasks and powers, the organisation of its governance structure, the operation of its main organisation, or the main financing of its activities;
(b) it shall ensure that the Centre is able to recruit the highly qualified and specialised staff it requires to perform the tasks and exercise the powers provided by this Regulation;
(c) it shall ensure that it can be set up on site upon the entry into force of this Regulation;
(d) it shall ensure appropriate accessibility of the location, the existence of adequate education facilities for the children of staff members, and appropriate access to the labour market, social security and medical care for both children and spouses;
(da) it shall ensure a balanced geographical distribution of EU institutions, bodies and agencies across the Union;
(db) it shall ensure that its national child sexual abuse framework is of proven quality and repute, and shall benefit from the experience of national authorities;
(dc) it shall enable adequate training opportunities for combating child sexual abuse activities;
(dd) it shall enable close cooperation with EU institutions, bodies and agencies, but it shall be independent of any of them;
(de) it shall ensure sustainability and digital security and connectivity with regard to physical and IT infrastructure and working conditions.
2023/07/28
Committee: LIBE
Amendment 1553 #
Proposal for a regulation
Article 43 – paragraph 1 – point 2
(2) facilitate the detection process referred to in Section 2 of Chapter II, by: (a) providing the opinions on intended detection orders referred to in Article 7(3), first subparagraph, point (d); (b) maintaining and operating the databases of indicators referred to in Article 44; (c) giving providers of hosting services and providers of interpersonal communications services that received a detection order access to the relevant databases of indicators in accordance with Article 46; (d) making technologies available to providers for the execution of detection orders issued to them, in accordance with Article 50(1); (deleted)
2023/07/28
Committee: LIBE
Amendment 1572 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point a
(a) collecting, recording, analysing and providing gender and age specific information, providing analysis based on anonymised and non-personal data gathering, including gender and age disaggregated data, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51;
2023/07/28
Committee: LIBE
Amendment 1575 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b
(b) supporting the development and dissemination of research and expertise on those matters and on assistance to survivors, taking into account the gender dimension, including by serving as a hub of expertise to support evidence-based policy;
2023/07/28
Committee: LIBE
Amendment 1577 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b a (new)
(ba) providing technical expertise and promoting the exchange of best practices among Member States on raising awareness for the prevention of child sexual abuse online in formal and non-formal education. Such efforts shall be age-appropriate and gender-sensitive;
2023/07/28
Committee: LIBE
Amendment 1582 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b b (new)
(bb) exchanging best practices among Coordinating Authorities regarding the available tools to reduce the risk of children becoming victims of sexual abuse and to provide specialized assistance to survivors, in an age-appropriate and gender-sensitive way.
2023/07/28
Committee: LIBE
Amendment 1585 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b c (new)
(bc) referring survivors to appropriate child protection services;
2023/07/28
Committee: LIBE
Amendment 1587 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c a (new)
(ca) in its engagement with survivors or in any decision affecting survivors, the EU Centre shall operate in a way that minimises risks to survivors, especially children, addresses harm to survivors and meets their needs in an age-appropriate and gender- and victim-sensitive manner.
2023/07/28
Committee: LIBE
Amendment 1588 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c b (new)
(cb) create and oversee an "EU hashing list of known child sexual abuse material" and modify the content of that list, independently and autonomously and free of political, government or industry influence or interference;
2023/07/28
Committee: LIBE
Amendment 1589 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c c (new)
(cc) develop, in accordance with the implementing act as referred to in Article 43a, the European Centralised Helpline for Abuse of Teenagers (eCHAT), interconnecting the national hotlines' helplines via effective interoperability and allowing children to reach out 24/7 via a recognisable central helpline in an anonymous way, in their own language and free of charge;
2023/07/28
Committee: LIBE
Amendment 1590 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c d (new)
(cd) have at its disposal the resources needed to develop, where possible, open-source hashing technology tools for small and medium-sized relevant information society services to prevent the dissemination of known child sexual abuse material in publicly accessible content.
2023/07/28
Committee: LIBE
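The open-source hashing tools envisaged in point (cd) of Amendment 1590 come down to checking publicly accessible content against a list of digests of already-known material. The short Python sketch below illustrates only that matching step, under stated assumptions: it uses an exact-match list of SHA-256 digests and invented file names (eu_hash_list.txt, public_upload.jpg), none of which come from the proposal; production tools would normally rely on a perceptual hash (for example PDQ or PhotoDNA) so that re-encoded or resized copies still match, which is not implemented here.

# Illustrative sketch only; the list format, file names and exact-match
# approach are assumptions, not part of the proposed Regulation.
import hashlib
from pathlib import Path

def load_hash_list(path: Path) -> set[str]:
    """Load one lowercase hex SHA-256 digest per line (hypothetical list format)."""
    return {
        line.strip().lower()
        for line in path.read_text(encoding="utf-8").splitlines()
        if line.strip()
    }

def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_material(upload: Path, hash_list: set[str]) -> bool:
    """Return True if the upload's digest appears on the loaded hash list."""
    return sha256_of_file(upload) in hash_list

if __name__ == "__main__":
    hashes = load_hash_list(Path("eu_hash_list.txt"))
    print(is_known_material(Path("public_upload.jpg"), hashes))

A provider integrating such a check would typically run it at upload time and only for publicly accessible content, in line with the limitation to "publicly accessible content" in point (cd).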
Amendment 1591 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c e (new)
(ce) coordinate the sharing and filtering of Suspicious Activity Reports on alleged "known child sexual abuse material", operating independently, autonomously, free of political, government or industry influence or interference and in full respect of fundamental rights, including privacy and data protection. [By 1 year after entry into force] the Commission shall adopt a delegated act laying down requirements for a Suspicious Activity Reports format, as referred to in this paragraph, and the differentiation between actionable and non-actionable Suspicious Activity Reports. This delegated act shall not prohibit, weaken or undermine end-to-end encryption, prohibit providers of information society services from providing their services applying end-to-end encryption or be interpreted in that way.
2023/07/28
Committee: LIBE
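The Suspicious Activity Report format and the actionable/non-actionable distinction in point (ce) of Amendment 1591 are left to a future delegated act, so any concrete structure is speculative. The Python sketch below is one possible shape for such a record and a deliberately simple triage rule; every field name and the rule itself are assumptions made purely for illustration, not the format the delegated act would define.

# Hypothetical sketch of a Suspicious Activity Report record and a toy
# triage rule; the actual format is to be laid down in a delegated act.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class SuspiciousActivityReport:
    report_id: str
    submitted_at: datetime
    reported_hash: str          # digest the reporter says matches known material
    verified_list_match: bool   # True once the match is re-verified against the list
    service_name: str
    content_still_available: bool

def is_actionable(report: SuspiciousActivityReport) -> bool:
    """Toy rule: forward only re-verified matches to content that is still available."""
    return report.verified_list_match and report.content_still_available

if __name__ == "__main__":
    sample = SuspiciousActivityReport(
        report_id="SAR-0001",
        submitted_at=datetime.now(timezone.utc),
        reported_hash="0" * 64,
        verified_list_match=True,
        service_name="example-hosting-service",
        content_still_available=True,
    )
    print(is_actionable(sample))

Whatever format is eventually adopted, the purpose of the distinction is the same as in this toy rule: only reports that can be verified and acted upon are passed on, which also limits unnecessary processing of personal data.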
Amendment 1592 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point c f (new)
(cf) scan public servers and public communications channels for known child sexual abuse material, with proven technology, solely for the purposes of amending the EU hashing list and flagging the content for removal to the service provider of the specific public server or public communications channel, without prejudice to Article -3. The European Data Protection Board shall issue guidelines regarding the compliance with Regulation (EU) 2016/679 of existing and future technologies that are used for the purpose of scanning.
2023/07/28
Committee: LIBE
Amendment 1597 #
Proposal for a regulation
Article 43 a (new)
Article 43a
Implementing act for the interconnection of helplines
1. The national helplines referred to in Article 43 shall be interconnected via the European Centralised Helpline for Abuse of Teenagers (eCHAT), to be developed and operated by the EU Centre by ... [two years after the date of entry into force of this Regulation].
2. The Commission shall be empowered to adopt, by means of implementing acts, the technical specifications and procedures necessary to provide for the interconnection of national hotlines' online chat systems via eCHAT in accordance with Article 43 with regard to:
(a) the technical data necessary for the eCHAT system to perform its functions and the method of storage, use and protection of that technical data;
(b) the common criteria according to which national helplines shall be available through the system of interconnection of helplines;
(c) the technical details on how helplines shall be made available;
(d) the technical conditions of availability of services provided by the system of interconnection of helplines.
Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 5 of Regulation (EU) No 182/2011.
3. When adopting the implementing acts referred to in paragraph 2, the Commission shall take into account proven technology and existing practices.
2023/07/28
Committee: LIBE
Amendment 1786 #
Proposal for a regulation
Article 61 – paragraph 1 – subparagraph 1
The Executive Board shall be gender-balanced and composed of the Chairperson and the Deputy Chairperson of the Management Board, two other members appointed by the Management Board from among its members with the right to vote and two representatives of the Commission to the Management Board. The Chairperson of the Management Board shall also be the Chairperson of the Executive Board. The composition of the Executive Board shall take into consideration gender balance, with at least 40% of each sex.
2023/07/28
Committee: LIBE
Amendment 1806 #
Proposal for a regulation
Article 66 a (new)
Article 66a
Establishment and tasks of the Experts' Consultative Forum
1. The EU Centre shall establish a Consultative Forum to assist it by providing it with independent advice on survivor-related matters. The Consultative Forum shall act upon request of the Management Board or the Executive Director.
2. The Consultative Forum shall consist of a maximum of fifteen members. Members of the Consultative Forum shall, in an equal manner, be appointed from child survivors and parents of child survivors, as well as representatives of organisations acting in the public interest, including:
(a) organisations representing or promoting the rights of the LGBTQIA+ community, specifically minors;
(b) organisations representing or promoting children's rights;
(c) organisations representing or promoting child survivors' rights;
(d) organisations representing or promoting digital rights.
They shall be appointed by the Management Board following the publication of a call for expression of interest in the Official Journal of the European Union.
3. The mandate of members of the Consultative Forum shall be four years. Those mandates shall be renewable once.
4. The Consultative Forum shall:
(a) provide the Management Board and the Executive Director with advice on matters related to survivors;
(b) provide the Management Board, the Executive Director and the Technology Committee with advice on preventive measures for relevant information society services;
(c) contribute to the EU Centre communication strategy referred to in Article 50(5);
(d) provide its opinion on the proportionality of technologies used to detect known child sexual abuse;
(e) maintain an open dialogue with the Management Board and the Executive Director on all matters related to survivors, particularly on the protection of survivors' rights and digital rights.
2023/07/28
Committee: LIBE
Amendment 1807 #
Proposal for a regulation
Chapter IV – Section 5 – Part 3 a (new)
3a Part 3 a (new): Fundamental Rights Protection
Article 66b
Fundamental rights officer
1. A fundamental rights officer shall be appointed by the Management Board on the basis of a list of three candidates, after consultation with the Experts' Consultative Forum. The fundamental rights officer shall have the necessary qualifications, expert knowledge and professional experience in the field of fundamental rights.
2. The fundamental rights officer shall perform the following tasks:
(a) contributing to the Centre's fundamental rights strategy and the corresponding action plan, including by issuing recommendations for improving them;
(b) monitoring the Centre's compliance with fundamental rights, including by conducting investigations into any of its activities;
(c) promoting the Centre's respect for fundamental rights;
(d) advising the Centre, where he or she deems it necessary or where requested, on any activity of the Centre, without delaying those activities;
(e) providing opinions on working arrangements;
(f) providing the secretariat of the Consultative Forum;
(g) informing the Management Board and the Executive Director about possible violations of fundamental rights during activities of the Centre;
(h) performing any other tasks, where provided for by this Regulation.
3. The Management Board shall lay down special rules applicable to the fundamental rights officer in order to guarantee that the fundamental rights officer and his or her staff are independent in the performance of their duties. The fundamental rights officer shall report directly to the Management Board and shall cooperate with the Technology Committee. The Management Board shall ensure that action is taken with regard to recommendations of the fundamental rights officer. In addition, the fundamental rights officer shall publish annual reports on his or her activities and on the extent to which the activities of the Centre respect fundamental rights. Those reports shall include information on the complaints mechanism and the implementation of the fundamental rights strategy.
4. The Centre shall ensure that the fundamental rights officer is able to act autonomously and independently in the conduct of his or her duties. The fundamental rights officer shall have at his or her disposal the sufficient and adequate human and financial resources necessary for the fulfilment of his or her tasks. The fundamental rights officer shall select his or her staff, and that staff shall report only to him or her.
5. The fundamental rights officer shall be assisted by a deputy fundamental rights officer. The deputy fundamental rights officer shall be appointed by the Management Board from a list of at least three candidates presented by the fundamental rights officer. The deputy fundamental rights officer shall have the necessary qualifications and experience in the field of fundamental rights and shall be independent in the conduct of his or her duties. If the fundamental rights officer is absent or indisposed, the deputy fundamental rights officer shall assume the fundamental rights officer's duties and responsibilities.
6. The fundamental rights officer shall have access to all information concerning respect for fundamental rights in all the activities of the Centre.
Article 66c
Complaints mechanism
1. The Centre shall, in cooperation with the fundamental rights officer, take the necessary measures to set up and further develop an independent and effective complaints mechanism in accordance with this Article to monitor and ensure respect for fundamental rights in all the activities of the Centre.
2. Any person who is directly affected by the actions or failure to act on the part of staff involved in a joint operation, pilot project, or an operational activity of the Centre, and who considers himself or herself to have been the subject of a breach of his or her fundamental rights due to those actions or that failure to act, or any party representing such a person, may submit a complaint in writing to the Centre.
3. The fundamental rights officer shall be responsible for handling complaints received by the Centre in accordance with the right to good administration. For that purpose, the fundamental rights officer shall review the admissibility of a complaint, register admissible complaints, forward all registered complaints to the Executive Director and forward complaints concerning members of the teams to the relevant authority or body competent for fundamental rights for further action in accordance with their mandate. The fundamental rights officer shall also register and ensure the follow-up by the Centre or that authority or body.
4. In accordance with the right to good administration, if a complaint is admissible, complainants shall be informed that the complaint has been registered, that an assessment has been initiated and that a response may be expected as soon as it becomes available. If a complaint is forwarded to national authorities or bodies, the complainant shall be provided with their contact details. If a complaint is declared inadmissible, the complainant shall be informed of the reasons and, if possible, provided with further options for addressing their concerns. The Centre shall provide for an appropriate procedure in cases where a complaint is declared inadmissible or unfounded. Any decision shall be in written form and reasoned. The fundamental rights officer shall reassess the complaint if the complainant submits new evidence in situations where the complaint has been declared inadmissible or unfounded.
5. In the case of a registered complaint concerning a staff member of the Centre, the fundamental rights officer shall recommend appropriate follow-up, including disciplinary measures, to the Executive Director and, where appropriate, referral for the initiation of civil or criminal justice proceedings in accordance with this Regulation and national law. The Executive Director shall ensure the appropriate follow-up and shall report back to the fundamental rights officer within a determined timeframe and, if necessary, at regular intervals thereafter, as to the findings, the implementation of disciplinary measures, and the follow-up by the Centre in response to a complaint. If a complaint is related to data protection issues, the Executive Director shall consult the data protection officer of the Centre before taking a decision on the complaint. The fundamental rights officer and the data protection officer shall establish, in writing, a memorandum of understanding specifying their division of tasks and cooperation as regards complaints received.
6. The fundamental rights officer shall include information on the complaints mechanism in his or her annual report, as referred to in Article 66b, including specific references to the Centre's findings and the follow-up to complaints.
7. The fundamental rights officer shall, in accordance with paragraphs 1 to 9 and after consulting the Experts' Consultative Forum, draw up a standardised complaint form requiring detailed and specific information concerning the alleged breach of fundamental rights. The fundamental rights officer shall also draw up any further detailed rules as necessary. The fundamental rights officer shall submit that form and such further detailed rules to the Executive Director and to the Management Board. The Centre shall ensure that information about the possibility and procedure for making a complaint is readily available, including for vulnerable persons. The standardised complaint form shall be made available on the Centre's website and in hardcopy during all activities of the Centre in languages that third-country nationals understand or are reasonably believed to understand. The standardised complaint form shall be easily accessible, including on mobile devices. The Centre shall ensure that further guidance and assistance on the complaints procedure is provided to complainants. Complaints shall be considered by the fundamental rights officer even when they have not been submitted in the standardised complaint form.
8. Any personal data contained in a complaint shall be handled and processed by the Centre, including the fundamental rights officer, in accordance with Regulation (EU) 2018/1725. Where a complainant submits a complaint, that complainant shall be understood to consent to the processing of his or her personal data by the Centre and the fundamental rights officer within the meaning of point (d) of Article 5(1) of Regulation (EU) 2018/1725. In order to safeguard the interests of the complainants, complaints shall be dealt with confidentially by the fundamental rights officer in accordance with national and Union law unless the complainant explicitly waives his or her right to confidentiality. When complainants waive their right to confidentiality, it shall be understood that they consent to the fundamental rights officer or the Centre disclosing their identity to the competent authorities or bodies in relation to the matter under complaint, where necessary.
2023/07/28
Committee: LIBE