41 Amendments of Peter POLLÁK related to 2022/0155(COD)
Amendment 26 #
Proposal for a regulation
Recital 1
(1) Information society services have become very important for communication, expression, gathering of information and many other aspects of present-day life, including for children but also for perpetrators of child sexual abuse offences. Digital services have become an irreplaceable tool for today’s children, as information, elements of formal education, social contact and entertainment are increasingly online; whereas digital services can also expose children to risks such as unsuitable content, grooming, and child sexual abuse. Such offences, which are subject to minimum rules set at Union level, are very serious criminal offences that need to be prevented and combated effectively in order to protect children’s rights and well-being, as is required under the Charter of Fundamental Rights of the European Union (‘Charter’), and to protect society at large. Users of such services offered in the Union should be able to trust that the services concerned can be used safely, especially by children. In order to ensure a safer online experience for children and prevent the above-mentioned offences, digital literacy should be recognised as a mandatory skill by Member States and should be included in the school curriculum across the EU.
Amendment 28 #
Proposal for a regulation
Recital 2
(2) Given the central importance of relevant information society services, those aims can only be achieved by appropriate prevention techniques, improving digital literacy, and ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and combat such abuse. In order to alleviate the burden on providers, the measures should also aim to raise awareness amongst parents and children and further develop their digital skills, so that they can detect suspicious behaviours online. Once child sexual abuse material (CSAM) is reported, the measures taken should be targeted, carefully balanced and proportionate, so as to avoid any undue negative consequences for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid imposing any excessive burdens on the providers of the services.
Amendment 29 #
Proposal for a regulation
Recital 2
(2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and combat such abuse. The measures taken should be targeted, carefully balanced and proportionate, so as to avoid any undue negative consequences for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid imposing any excessive burdens on the providers of the services. To this end, fundamental importance should be attached to ensuring the necessary funding for European programmes and projects which aim to improve digital skills and awareness of risks linked to the digital world, such as “Media literacy for all”.
Amendment 32 #
Proposal for a regulation
Recital 3
(3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements and appropriate prevention techniques should be laid down at Union level.
Amendment 34 #
Proposal for a regulation
Recital 4 a (new)
(4 a) To ensure full application of the objectives of this Regulation, Member States shall implement prevention strategies and awareness campaigns in their school curricula and inside educational institutions. Taking into account the data collected by the EU Centre, Coordinating Authorities, relevant law enforcement agencies and existing hotlines across the EU, Member States should elaborate prevention techniques improving digital literacy, by educating children on how to safely surf online and how to recognise signals of cyber grooming. Prevention techniques and awareness campaigns should also target parents. Parents and caregivers shall be informed of the existence and the functioning of digital tools to limit and direct their child’s/children’s experience online and limit access to age-inappropriate or harmful content online.
Amendment 39 #
Proposal for a regulation
Recital 12
(12) For reasons of consistency and technological neutrality, the term ‘child sexual abuse material’ should for the purpose of this Regulation be defined as referring to any type of material constituting child pornography or pornographic performance within the meaning of Directive 2011/93/EU, which is capable of being disseminated through the use of hosting or interpersonal communication services. At present, such material typically consists of images or videos, without it however being excluded that it takes other forms, especially in view of future technological developments. Close attention should be paid to the development of new technologies and platforms, such as the metaverse. On such platforms, child sexual abuse material might be generated and exchanged, or child sexual abuse perpetrated through the use of avatars or any other form of virtual identities.
Amendment 46 #
Proposal for a regulation
Recital 35
(35) The dissemination of child sexual abuse material is a criminal offence that affects the rights of the victims depicted. Victims should therefore have the right to be forgotten, i.e. the right to request the deletion of child sexual abuse material depicting them. Victims should also have the right to obtain, upon request, from the EU Centre, albeit via the Coordinating Authorities, relevant information if known child sexual abuse material depicting them is reported by providers of hosting services or providers of publicly available interpersonal communications services in accordance with this Regulation.
Amendment 60 #
Proposal for a regulation
Recital 58
(58) In particular, in order to facilitate the cooperation needed for the proper functioning of the mechanisms set up by this Regulation, the EU Centre should establish and maintain the necessary information-sharing systems. When establishing and maintaining such systems, the EU Centre should cooperate with the European Union Agency for Law Enforcement Cooperation (‘Europol’), national hotlines and national authorities to build on existing systems and best practices, where relevant.
Amendment 63 #
Proposal for a regulation
Recital 60
(60) In the interest of legal certainty and effectiveness, the tasks of the EU Centre should be listed in a clear and comprehensive manner. With a view to ensuring the proper implementation of this Regulation, those tasks should relate in particular to the facilitation of the detection, reporting and blocking obligations imposed on providers of hosting services, providers of publicly available interpersonal communications services and providers of internet access services. However, for that same reason, the EU Centre should also be charged with certain other tasks, notably those relating to the implementation of the risk assessment and mitigation obligations of providers of relevant information society services, the removal of or disabling of access to child sexual abuse material by providers of hosting services, the provision of assistance to Coordinating Authorities, as well as the creation of prevention strategies, generation and sharing of knowledge and expertise related to online child sexual abuse.
Amendment 72 #
Proposal for a regulation
Recital 67
(67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse. The EU Centre should contribute to the creation of adequate prevention strategies and awareness campaigns on online grooming and the dissemination of CSAM, targeting children, parents and educators across the Union. In this connection, the EU Centre should cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned.
Amendment 78 #
Proposal for a regulation
Recital 70
(70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are in the frontline in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage them to work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines’ expertise and experience are an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union. Their experience and expertise shall help the EU Centre and Coordinating Authorities to design appropriate prevention techniques and awareness campaigns on online grooming and the dissemination of CSAM online.
Amendment 82 #
Proposal for a regulation
Recital 73
(73) To ensure its proper functioning, the necessary rules should be laid down regarding the EU Centre’s organisation. In the interest of consistency, those rules should be in line with the Common Approach of the European Parliament, the Council and the Commission on decentralised agencies. In order to complete its tasks, the EU Centre and Coordinating Authorities should have the necessary funds, human resources, investigative powers and technical capabilities to seriously and effectively pursue and investigate complaints and potential offenders, including appropriate training to build capacity in the judiciary and police units and to develop new high-tech capabilities to address the challenges of analysing vast amounts of child abuse imagery, including material hidden on the ‘dark web’.
Amendment 83 #
Proposal for a regulation
Recital 74
(74) In view of the need for technical expertise in order to perform its tasks, in particular the task of providing a list of technologies that can be used for detection, the EU Centre should stay consistently updated on technological developments that might lead to the creation of different or unconventional platforms, such as the metaverse, on which child sexual abuse might be perpetrated or child sexual abuse material be generated or exchanged; it should therefore have a Technology Committee composed of experts with an advisory function. The Technology Committee may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate, with respect to matters related to the detection of online child sexual abuse, to support the EU Centre in contributing to a high level of technical standards and safeguards in detection technology.
Amendment 85 #
Proposal for a regulation
Recital 74
(74) In view of the need for technical expertise in order to perform its tasks, in particular the task of providing a list of technologies that can be used for detection, the EU Centre should have a Technology Committee composed of experts with an advisory function. The Technology Committee may, in particular, provide expertise to support the work of the EU Centre, within the scope of its mandate, with respect to matters related to the prevention and detection of online child sexual abuse, to support the EU Centre in contributing to a high level of technical standards and safeguards in detection technology.
Amendment 87 #
Proposal for a regulation
Recital 76
(76) In the interest of good governance and drawing on the statistics and information gathered and transparency reporting mechanisms provided for in this Regulation, the Commission should carry out an evaluation of this Regulation within three years of the date of its entry into force, and every three years thereafter.
Amendment 89 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point e a (new)
(e a) Guidelines on the creation of appropriate prevention techniques on cyber grooming and the dissemination of CSAM online, targeting children and parents and empowering them to use digital technologies safely and responsibly.
Amendment 97 #
Proposal for a regulation
Article 3 – paragraph 6 a (new)
6 a. The EU Centre should use these risk assessment reports to prepare and adapt prevention techniques and bring them to the attention of Coordinating Authorities across the EU.
Amendment 112 #
Proposal for a regulation
Article 25 – paragraph 2 – subparagraph 2
The Coordinating Authority shall be responsible for all matters related to application and enforcement of this Regulation in the Member State concerned, unless that Member State has assigned certain specific tasks or sectors to other competent authorities. The Coordinating Authority shall also be responsible for the coordination and adaptation of prevention techniques elaborated by the EU Centre. The Coordinating Authority shall generate recommendations and good practices on improving digital literacy and skills amongst the population through the realisation of awareness campaigns at national level, targeting in particular parents and children on the detection and prevention of child sexual abuse online.
Amendment 118 #
Proposal for a regulation
Article 25 – paragraph 7 – point d a (new)
(d a) provide knowledge and experience on appropriate prevention techniques on grooming and the detection and dissemination of CSAM online;
Amendment 121 #
Proposal for a regulation
Article 26 – paragraph 4
4. The Coordinating Authorities shall ensure that relevant members of staff have the required qualifications, experience and technical skills to perform their duties in the area of combating online child sexual abuse. Members of staff shall be offered appropriate training in order to continuously improve their understanding of the constantly evolving digital technologies.
Amendment 122 #
Proposal for a regulation
Article 34 – paragraph 2
2. Coordinating Authorities shall also provide children with the necessary tools to recognise suspicious behaviour and potentially dangerous content online and easily submit a complaint under this Article. Coordinating Authorities shall examine every complaint and adopt a child-sensitive approach, taking into account the specificities of all elements of the complaint (website or interpersonal communication service, child’s age, maturity, views, needs and specific concerns).
Amendment 123 #
Proposal for a regulation
Article 39 – paragraph 1
1. Coordinating Authorities shall cooperate with each other, with national hotlines and any other competent authorities of the Member State that designated the Coordinating Authority, the Commission, the EU Centre and other relevant Union agencies, including Europol, to facilitate the performance of their respective tasks under this Regulation and ensure its effective, efficient and consistent application and enforcement. Coordinating Authorities shall exchange information and best practices on preventing and combating grooming and child sexual abuse online.
Amendment 126 #
Proposal for a regulation
Article 40 – paragraph 2 a (new)
2 a. The EU Centre shall elaborate appropriate prevention techniques on grooming and child sexual abuse online, based on its knowledge, expertise and achievements, in close cooperation with relevant stakeholders and in line with the Communication of the Commission of 11 May 2022, “A Digital Decade for children and youth: the new European strategy for a better internet for kids” (BIK+).
Amendment 140 #
Proposal for a regulation
Article 50 – paragraph 3
3. Where necessary for the performance of its tasks under this Regulation, the EU Centre shall carry out, participate in or encourage research, surveys and studies, either on its own initiative or, where appropriate and compatible with its priorities and its annual work programme, at the request of the European Parliament, the Council or the Commission. The collected knowledge (resulting from research, surveys and studies) shall serve as a tool to elaborate prevention techniques on child sexual abuse online to be adapted and implemented by Coordinating Authorities in each Member State.
Amendment 142 #
Proposal for a regulation
Article 50 – paragraph 4
4. The EU Centre shall provide the information referred to in paragraph 2 and the information resulting from the research, surveys and studies referred to in paragraph 3, including its analysis thereof, and its opinions on matters related to the prevention and combating of online child sexual abuse to other Union institutions, bodies, offices and agencies, Coordinating Authorities, hotlines, other competent authorities and other public authorities of the Member States, either on its own initiative or at the request of the relevant authority. Where appropriate, the EU Centre shall make such information publicly available.
Amendment 143 #
Proposal for a regulation
Article 50 – paragraph 5
5. The EU Centre shall develop prevention techniques on the detection of suspicious content and behaviour online and shall communicate them to the Coordinating Authorities of each Member State, so that they can adapt them and initiate measures to improve digital literacy and raise awareness amongst parents and educators of the existing digital tools to ensure a safe digital environment for children. The EU Centre shall also establish a communication strategy and promote dialogue with civil society organisations and providers of hosting or interpersonal communication services to raise public awareness of, and improve and continuously adapt, prevention techniques on grooming and online child sexual abuse.
Amendment 308 #
Proposal for a regulation
Recital 5
(5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services are publicly available. As services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of gaming, image-sharing and video-hosting, are equally at risk of misuse, they should also be covered by this Regulation. Online search engines and other artificial intelligence services should also be covered. However, given the inherent differences between the various relevant information society services covered by this Regulation and the related varying risks that those services are misused for the purpose of online child sexual abuse and the varying ability of the providers concerned to prevent and combat such abuse, the obligations imposed on the providers of those services should be differentiated in an appropriate and targeted manner. Considering the fundamental importance of the right to respect for private life and the right to protection of personal data, as guaranteed by the Charter of Fundamental Rights, nothing in this Regulation should be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications.
Amendment 333 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available interpersonal communications services should take effective and reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) 2022/2065 may consider to which extent mitigation measures adopted to comply with that obligation may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation. Mitigation measures necessary for the fulfilment of the obligations in this Regulation may include the design of online interfaces or parts thereof with the highest level of privacy, safety and security for children by default, the adoption of standards for the protection of children, participation in codes of conduct for protecting children, and targeted measures to protect the rights of the child, including age-appropriate parental control tools. Enabling flagging and/or notifying mechanisms and self-reporting functionalities, where possible with the use of AI, shall also serve to address the risk identified in the specific risk assessment pursuant to this Regulation.
Amendment 353 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when the provider refuses to cooperate by putting in place the mitigating measures aimed to limit the risk of misuse of a certain service for the purpose of online child sexual abuse, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request, as a measure of last resort, the issuance of detection orders. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available interpersonal communications services, and that solicitation of children mostly takes place in publicly available interpersonal communications services, it should only be possible to address detection orders to providers of such services. Such detection orders shall be issued with regard to the technical capacity of the provider, and shall in no way be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encrypted content and communications.
Amendment 373 #
Proposal for a regulation
Recital 23
(23) In addition, to avoid undue interference with fundamental rights and ensure proportionality, when it is established that those requirements have been met and a detection order is to be issued, it should still be ensured that the detection order is justified, proportionate and related only to an identifiable part of the specific service, user or group of users, as well as targeted and limited in time, so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively address the significant risk identified. This should concern, in particular, a limitation to an identifiable part or component of the service where possible without prejudice to the effectiveness of the measure, such as specific types of channels of a publicly available interpersonal communications service, or to specific users or specific groups of users, to the extent that they can be taken in isolation for the purpose of detection, as well as the specification of the safeguards additional to the ones already expressly specified in this Regulation, such as independent auditing, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of application of the detection order that the Coordinating Authority deems necessary. To avoid unreasonable or disproportionate outcomes, such requirements should be set after an objective and diligent assessment conducted on a case-by-case basis.
Amendment 383 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available interpersonal communications services to execute detection orders addressed to them should remain strictly limited to what is specified in this Regulation and in the detection orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. In accordance with Article 6a, nothing in this Regulation shall be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encrypted content or communications through client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third party actors with access to the end-to-end encrypted content and communications. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
Amendment 389 #
Proposal for a regulation
Recital 26 a (new)
(26a) End-to-end encryption is an essential tool to guarantee the security, privacy and confidentiality of the communications between users, including those of children. Any weakening of end-to-end encryption’s effect could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. Compromising the integrity of end-to-end encrypted content and communications shall be understood as the processing of any data that would compromise or put at risk the integrity and confidentiality of the aforementioned end-to-end encrypted content. Nothing in this Regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communication services provides third party actors with access to the end-to-end encrypted content and communications.
Amendment 651 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 a (new)
- functionalities enabling age- appropriate parental controls, including with the use of AI;
Amendment 653 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 4 b (new)
- functionalities enabling self- reporting, including with the use of AI;
Amendment 695 #
Proposal for a regulation
Article 3 – paragraph 2 a (new)
2a. The provider, where applicable, shall assess, in a separate section of its risk assessment, the voluntary use of specific technologies for the processing of personal and other data to the extent strictly necessary to detect, to report and to remove online child sexual abuse material from its services. Such voluntary use of specific technologies shall under no circumstances undermine the integrity and confidentiality of end-to-end encrypted content and communications.
Amendment 862 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
(b) take reasonable measures to prevent child users from accessing the software applications in relation to which they have identified a significant risk of use of the service concerned for the purpose of the solicitation of children; or where:
Amendment 864 #
Proposal for a regulation
Article 6 – paragraph 1 – point b – point i (new)
i) the developer of the software application has decided and informed the software application store that its terms and conditions of use do not permit child users,
Amendment 865 #
Proposal for a regulation
Article 6 – paragraph 1 – point b – point ii (new)
ii) the software application has an appropriate age rating model in place, or
Amendment 866 #
Proposal for a regulation
Article 6 – paragraph 1 – point b – point iii (new)
iii) the developer of the software application has requested the software application store not to allow child users to download its software applications.
Amendment 875 #
Proposal for a regulation
Article 6 a (new)
Article 6a
End-to-end encrypted services
Nothing in this Regulation shall be interpreted as prohibiting or compromising the integrity and confidentiality of end-to-end encrypted content and communications. Compromising the integrity of end-to-end encrypted content and communications shall be understood as the processing of any data that would compromise or put at risk the integrity and confidentiality of the content and communications in the end-to-end encryption. Nothing in this Regulation shall thus be interpreted as justifying client-side scanning with side-channel leaks or other measures by which the provider of a hosting service or a provider of interpersonal communications services provides third party actors with access to the end-to-end encrypted content.
Amendment 1017 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 1
The Coordinating Authority of establishment when requesting the issuance of detection orders, and the competent judicial or independent administrative authority when issuing the detection order, shall, in accordance with Article 8 of Regulation (EU) 2022/2065, target and specify it in such a manner that the negative consequences referred to in paragraph 2 remain limited to what is strictly necessary, justifiable and proportionate to effectively address the significant risk referred to in point (a) thereof, and limit the detection order to an identifiable part or component of a service, such as a specific channel of communication or a specific group of users identified with particularity for which the significant risk has been identified. In accordance with Article 6a, no such detection order shall be interpreted as prohibiting, or compromising the integrity and confidentiality of, end-to-end encrypted content and communications.