Activities of Marcel KOLAJA related to 2022/0155(COD)
Shadow opinions (2)
OPINION on the proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse
OPINION on the proposal for a regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse
Amendments (269)
Amendment 47 #
Proposal for a regulation
Recital 35 a (new)
Amendment 48 #
Proposal for a regulation
Recital 36
(36) In order to prevent children from falling victim to abuse, providers of very large online platforms which have identified the risk of use of their service for the purpose of online child sexual abuse in line with Article 3 should provide reasonable assistance by putting in place alert and alarm mechanisms in a prominent way on their platforms. The alert mechanism could consist of, for example, linking potential victims to local organisations such as helplines, victims' rights organisations or hotlines. Providers of very large online platforms should ensure adequate follow-up, when a report or alert is made, in the language chosen by the user for using their service. Given the impact on the rights of victims depicted in such known child sexual abuse material and the typical ability of providers of hosting services to limit that impact by helping ensure that the material is no longer available on their services, those providers should assist victims who request the removal or disabling of access of the material in question. That assistance should remain limited to what can reasonably be asked from the provider concerned under the given circumstances, having regard to factors such as the content and scope of the request, the steps needed to locate the items of known child sexual abuse material concerned and the means available to the provider. The assistance could consist, for example, of helping to locate the items, carrying out checks and removing or disabling access to the items. Considering that carrying out the activities needed to obtain such removal or disabling of access can be painful or even traumatic as well as complex, victims should also have the right to receive adequate psycho-social, age-appropriate, child-friendly support and to be assisted by the EU Centre and its relevant partners, such as child helplines or other psycho-social support mechanisms, in this regard, via the Coordinating Authorities. Member States should establish and improve the functioning of child helplines and missing children hotlines, including through funding and capacity building, in line with Article 96 of Directive (EU) 2018/1972. Victim identification is key not only for tracking down online child sexual abuse, but also for preventing victimisation, stopping the further spread of damaging material and ensuring that victims can benefit from available assistance. However, such victim identification requires a high degree of specialisation and adequate resources. Therefore, the European Cybercrime Centre's efforts in victim identification should be complemented at national level.
Amendment 53 #
Proposal for a regulation
Recital 44
(44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one lead authority as the Coordinating Authority, whilst the designated authority should automatically be considered the Coordinating Authority where a Member State designates only one authority. For those reasons, the Coordinating Authority should act as the single contact point with regard to all matters related to the application of this Regulation and to achieving the objective of this Regulation, including prevention matters, without prejudice to the enforcement powers of other national authorities. It is essential to ensure the training of officials who could be in close contact with victims, including law enforcement officers, judges, prosecutors, lawyers, forensic experts and social workers, in order to ensure that such officials are able to understand the problems that child victims can face, and in order to ensure that the situation is prevented and mitigated if necessary. The Coordinating Authority should therefore also act as the single point of contact with regard to all matters related to the achievement of the objective of this Regulation, including prevention, with regard to awareness raising and training of officials.
Amendment 54 #
Proposal for a regulation
Recital 45 a (new)
(45 a) Given the EU Centre's particular expertise with regard to the generation and sharing of knowledge, Member States should ensure that such information is shared and promoted at national level. For this purpose, they should cooperate with partner organisations, including with semi-public organisations and hotlines, as well as with civil society. It is important to ensure that practitioners who get in close contact with child victims are adequately trained to deal with such victims, and that the situation of the victim is adequately mitigated. Therefore, the Coordinating Authority should ensure that officials such as law enforcement officers, judges, prosecutors, lawyers, forensic experts and social workers cooperate with civil society and semi-public organisations.
Amendment 70 #
Proposal for a regulation
Recital 67
(67) Given its central position resulting from the performance of its primary tasks under this Regulation and the information and expertise it can gather in connection thereto, the EU Centre should also contribute to the achievement of the objectives of this Regulation by serving as a hub for knowledge, expertise and research on matters related to the prevention and combating of online child sexual abuse, including education and awareness raising, and prevention programmes available for potential offenders and offenders during and after criminal proceedings. The collection and analysis of data should include the list of education and awareness raising material made part of the official curricula. In this connection, the EU Centre should bring together practitioners and researchers. The EU Centre should also cooperate with relevant stakeholders from both within and outside the Union and allow Member States to benefit from the knowledge and expertise gathered, including best practices and lessons learned.
Amendment 76 #
Proposal for a regulation
Recital 70
(70) Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are on the front line in the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage them to work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines' expertise and experience are an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union. Child helplines are equally on the front line in the fight against online child sexual abuse. Therefore, the EU Centre should also recognise the work of child helplines in victim response, and the existing referral mechanisms between child helplines and hotlines. The EU Centre should coordinate services for victims.
Amendment 94 #
Proposal for a regulation
Article 2 – paragraph 1 – point w a (new)
(w a) ‘very large online platform’ means an online platform which has a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, and which is designated as a very large online platform pursuant to Article 33(4) of Regulation (EU) 2022/2065;
Amendment 105 #
Proposal for a regulation
Article 21 – title
Victims’ right of assistance and support for removal
Amendment 106 #
Proposal for a regulation
Article 21 – paragraph 1
1. The providers of very large online platforms that have identified the risk of use of their service for the purpose of online child sexual abuse in line with Article 3 shall provide reasonable assistance, on request, to persons residing in the Union that seek to report potential abuse, by putting in place reporting functions in a prominent way on their platform. Such providers shall ensure adequate follow-up, when a report or alert is made, in the language that the user has chosen for their service. Providers of hosting services shall provide reasonable assistance, on request, to persons residing in the Union that seek to have one or more specific items of known child sexual abuse material depicting them removed or to have access thereto disabled by the provider.
Amendment 110 #
Proposal for a regulation
Article 21 – paragraph 4 a (new)
4 a. Member States shall establish and improve the functioning of child helplines and missing children hotlines, including through funding and capacity building, in line with Article 96 of Directive (EU) 2018/1972.
Amendment 111 #
4 b. Member States shall ensure that law enforcement authorities have adequate technical, financial and human resources to carry out their tasks, including for the purpose of identification of victims.
Amendment 113 #
Proposal for a regulation
Article 25 – paragraph 2 – subparagraph 2
The Coordinating Authority shall be responsible for all matters related to the application and enforcement of this Regulation, and to the achievement of the objective of this Regulation, in the Member State concerned, unless that Member State has assigned certain specific tasks or sectors to other competent authorities.
Amendment 114 #
Proposal for a regulation
Article 25 – paragraph 2 – subparagraph 3
The Coordinating Authority shall in any event be responsible for ensuring coordination at national level in respect of those matters, including matters related to prevention, and for contributing to the effective, efficient and consistent application and enforcement of this Regulation throughout the Union.
Amendment 117 #
Proposal for a regulation
Article 25 – paragraph 5
5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to coordinate prevention within the Member State and to handle requests for clarification, feedback and other communications in relation to all matters related to the application and enforcement of this Regulation in that Member State. Member States shall make the information on the contact point publicly available and communicate it to the EU Centre. They shall keep that information updated.
Amendment 119 #
Proposal for a regulation
Article 25 a (new)
Article 25 a
Cooperation with partner organisations
Where necessary for the performance of its tasks under this Regulation, including the achievement of the objective of this Regulation, and in order to promote the generation and sharing of knowledge in line with Article 43(6), the Coordinating Authority shall cooperate with organisations and networks with information and expertise on matters related to the prevention and combating of online child sexual abuse, including civil society organisations, semi-public organisations and practitioners.
Amendment 129 #
(a) collecting, recording, analysing and providing information, providing analysis based on anonymised and non-personal data gathering, and providing expertise on matters regarding the prevention and combating of online child sexual abuse, in accordance with Article 51, including education and awareness raising programmes, and intervention programmes;
Amendment 131 #
Proposal for a regulation
Article 43 – paragraph 1 – point 6 – point b
(b) supporting the development and dissemination of research and expertise on those matters and on assistance to victims, including by serving as a hub of expertise to support evidence-based policy and by linking researchers to practitioners;
Amendment 161 #
Proposal for a regulation
Recital 2
(2) Given the central importance of relevant information society services, those aims can only be achieved by ensuring that providers offering such services in the Union behave responsibly and take reasonable measures to minimise the risk of their services being misused for the purpose of child sexual abuse, those providers often being the only ones in a position to prevent and to help combat such abuse. The measures taken should be targeted, effective, evidence-based, proportionate and subject to constant review, so as to avoid any undue negative consequences for the fight against crime and for those who use the services for lawful purposes, in particular for the exercise of their fundamental rights protected under Union law, that is, those enshrined in the Charter and recognised as general principles of Union law, and so as to avoid directly or indirectly imposing any excessive burdens on the providers of the services.
Amendment 164 #
Proposal for a regulation
Recital 3
(3) Member States are increasingly introducing, or are considering introducing, national laws to prevent and combat online child sexual abuse, in particular by imposing requirements on providers of relevant information society services. In the light of the inherently cross-border nature of the internet and the service provision concerned, those national laws, which diverge, may have a direct negative effect on the internal market. To increase legal certainty, eliminate the resulting obstacles to the provision of the services and ensure a level playing field in the internal market, the necessary harmonised requirements should be laid down at Union level.
Amendment 165 #
Proposal for a regulation
Recital 4
(4) Therefore, this Regulation should contribute to the proper functioning of the internal market by setting out clear, uniform and balanced rules to prevent and combat child sexual abuse in a manner that is demonstrably and durably effective and that respects the fundamental rights of all parties concerned. In view of the fast-changing nature of the services concerned and the technologies used to provide them, those rules should be laid down in a technology-neutral and future-proof manner, so as not to hamper the fight against crime.
Amendment 169 #
Proposal for a regulation
Recital 5
(5) In order to achieve the objectives of this Regulation, it should cover providers of services that are misused for the purpose of online child sexual abuse. As they are misused to a significant extent for that purpose, those services could include publicly available number-independent interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services are publicly available. As online games which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of gaming, image-sharing and video-hosting, are also at risk of misuse, they should also be covered by this Regulation. However, given the inherent differences between the various relevant information society services covered by this Regulation and the related varying risks that those services are misused for the purpose of online child sexual abuse and the varying ability of the providers concerned to prevent and combat such abuse, the obligations imposed on the providers of those services should be differentiated in an appropriate manner.
Amendment 170 #
Proposal for a regulation
Recital 6
(6) Online child sexual abuse can also involve the misuse of information society services offered in the Union by providers established in third countries. In order to ensure the effectiveness of the rules laid down in this Regulation and a level playing field within the internal market, those rules should apply to all providers, irrespective of their place of establishment or residence, that offer services in the Union, as evidenced by a substantial connection to the Union.
Amendment 173 #
Proposal for a regulation
Recital 9
(9) Article 15(1) of Directive 2002/58/EC allows Member States to adopt legislative measures to restrict the scope of the rights and obligations provided for in certain specific provisions of that Directive relating to the confidentiality of communications when such restriction constitutes a necessary, appropriate and proportionate measure within a democratic society, inter alia, to prevent, investigate, detect and prosecute criminal offences, provided certain conditions are met, including compliance with the Charter, which, inter alia, requires the specific measures to be provided for by law and to genuinely achieve objectives of general interest. Applying the requirements of that provision by analogy, this Regulation should limit the exercise of the rights and obligations provided for in Article 5(1) and (3) and Article 6(1) of Directive 2002/58/EC, insofar as strictly necessary in line with Article 52 of the Charter to execute investigation orders issued in accordance with this Regulation with a view to preventing and combating online child sexual abuse.
Amendment 175 #
Proposal for a regulation
Recital 11
(11) A substantial connection to the Union should be considered to exist where the relevant information society service has an establishment in the Union or, in its absence, on the basis of the existence of a significant number, in relation to population size, of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States should be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of a software application in the relevant national software application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1), point (c), of Regulation (EU) 1215/2012 of the European Parliament and of the Council44. Mere technical accessibility of a website from the Union cannot, alone, be considered as establishing a substantial connection to the Union. _________________ 44 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
Amendment 178 #
Proposal for a regulation
Recital 14
(14) With a view to minimising the risk that their services are misused for the dissemination of known or new child sexual abuse material or the solicitation of children, providers of hosting services and providers of publicly available number-independent interpersonal communications services that are exposed to a substantial amount of child sexual abuse material should assess the existence of a significant risk stemming from the design and functioning of the services that they offer in the Union. To guide their risk assessment, a non-exhaustive list of elements to be taken into account should be provided. To allow for a full consideration of the specific characteristics of the services they offer, providers should be allowed to take account of additional elements where relevant. As risks evolve over time, in function of developments such as those related to technology and the manners in which the services in question are offered and used, it is appropriate to ensure that the risk assessment, as well as the effectiveness and proportionality of specific measures, are updated regularly and when needed for particular reasons.
Amendment 179 #
Proposal for a regulation
Recital 15
(15) Some of those providers of relevant information society services in scope of this Regulation may also be subject to an obligation to conduct a risk assessment under Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] with respect to information that they store and disseminate to the public, which should form the basis for the risk assessment under this instrument. For the purposes of the present Regulation, those providers may draw on such a risk assessment and complement it with a more specific assessment of the risks of use of their services for the purpose of online child sexual abuse, as required by this Regulation.
Amendment 182 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number-independent interpersonal communications services should take reasonable specific measures to mitigate the risk of their services being misused for the dissemination of known child sexual abuse material. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC] may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age verification and parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.
Amendment 187 #
Proposal for a regulation
Recital 17
(17) To allow for innovation and ensure proportionality and technological neutrality, no exhaustive list of the compulsory specific measures should be established. Instead, providers should be left a degree of flexibility to design and implement measures tailored to the risk exposure and the characteristics of the services they provide and the manners in which those services are used. In particular, providers are free to design and implement, in accordance with Union law, measures based on their existing practices to detect online child sexual abuse. Specific measures could include technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety, such as mechanisms for users to block or mute other users, mechanisms that ask for confirmation before displaying certain content, and tools that prompt or warn users.
Amendment 191 #
Proposal for a regulation
Recital 17 a (new)
(17 a) Relying on providers for risk mitigation measures comes with inherent problems, as business models, technologies and crimes evolve continuously. As a result, clear targets, oversight, review and adaptation, led by national supervisory authorities, are needed to avoid measures becoming redundant, disproportionate, ineffective, counterproductive and outdated.
Amendment 195 #
Proposal for a regulation
Recital 18
(18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available number-independent interpersonal communications services should, when designing and implementing the specific measures, give importance not only to ensuring their effectiveness, but also to avoiding any undue negative consequences for other affected parties, notably for the exercise of users’ fundamental rights. In order to ensure proportionality, when determining which specific measures should reasonably be taken in a given situation, account should also be taken of the ongoing effectiveness of the measures and the financial and technological capabilities and the size of the provider concerned. When selecting appropriate specific measures, providers should at least duly consider the possible measures listed in this Regulation, as well as, where appropriate, other measures such as those based on industry best practices, including as established through self-regulatory cooperation, and those contained in guidelines from the Commission, quantifying the expected impact of the available measures. Objective data on ongoing effectiveness must be provided in order for any measure to be recognised as best practice.
Amendment 196 #
Proposal for a regulation
Recital 19
(19) In the light of their role as intermediaries facilitating access to software applications that may be misused for online child sexual abuse, providers of software application stores should be made subject to obligations to take certain reasonable measures to assess and mitigate that risk. The providers should make that assessment in a diligent manner, making efforts that are reasonable under the given circumstances, having regard inter alia to the nature and extent of that risk as well as their financial and technological capabilities and size. The providers should only make available on their platform software applications if, prior to the use of their service, they have obtained the contact details of the software application developing team, with the cooperation of the specific providers of the services offered through the software application where possible.
Amendment 198 #
Proposal for a regulation
Recital 20
(20) With a view to ensuring effective prevention and fight against online child sexual abuse, when data on the impact of specific measures demonstrate that their impact is below pre-determined targets, the Coordinating Authorities designated by Member States under this Regulation should be empowered to request the implementation of additional targeted and proportionate risk mitigation measures. In order to avoid any undue interference with fundamental rights and to ensure proportionality, that power should be subject to a carefully balanced set of targets, limits and safeguards. For instance, considering that child sexual abuse material tends to be disseminated through hosting services and publicly available number-independent interpersonal communications services, and that solicitation of children mostly takes place in publicly available number-independent interpersonal communications services, it should only be possible to address investigation orders to providers of such services in relation to specific suspects, a specific group of suspects or a specific incident.
Amendment 201 #
Proposal for a regulation
Recital 21
(21) Furthermore, as part of those limits and safeguards, investigation orders should only be issued after a diligent and objective assessment leading to the finding of a significant risk of the specific service concerned being misused for a given type of online child sexual abuse covered by this Regulation. One of the elements to be taken into account in this regard is the likelihood that the service is used to an appreciable extent, that is, beyond isolated and relatively rare instances, for such abuse. The criteria should vary so as to take account of the different characteristics of the various types of online child sexual abuse at stake and of the different characteristics of the services used to engage in such abuse, as well as to ensure that the order is necessary and proportionate.
Amendment 203 #
Proposal for a regulation
Recital 22
(22) However, the finding of such a significant risk should in itself be insufficient to justify the issuance of a detection order, given that in such a case the order might lead to disproportionate negative consequences for the rights and legitimate interests of other affected parties, in particular for the exercise of users’ fundamental rights. It should be ensured that investigation orders can be issued only after the Coordinating Authorities and the competent judicial authority or independent administrative authority have objectively and diligently assessed, identified and weighted, on a case-by-case basis, not only the likelihood and seriousness of the potential consequences of the service being misused for the type of online child sexual abuse at issue, but also the specific results anticipated by the measure and the likelihood and seriousness of any potential negative consequences for other parties affected. With a view to avoiding the imposition of excessive burdens, the assessment should also take account of the financial and technological capabilities and size of the provider concerned.
Amendment 205 #
Proposal for a regulation
Recital 23
(23) In addition, to avoid undue interference with fundamental rights and to ensure proportionality, when it is established that those requirements have been met and an investigation order is to be issued, it should still be ensured that the investigation order is targeted and specific enough so as to ensure that any such negative consequences for affected parties do not go beyond what is strictly necessary to effectively address the significant risk identified. This should concern, in particular, a limitation to an identifiable part or component of the service where possible without prejudice to the effectiveness of the measure, such as specific types of channels of a publicly available number-independent interpersonal communications service, or to specific users or specific groups of users, to the extent that they can be taken in isolation for the purpose of detection, as well as the specification of the safeguards additional to the ones already expressly specified in this Regulation, such as independent auditing, the provision of additional information or access to data, or reinforced human oversight and review, and the further limitation of the duration of application of the order that the Coordinating Authority deems necessary. To avoid unreasonable or disproportionate outcomes, such requirements should be set after an objective and diligent assessment conducted on a case-by-case basis.
Amendment 208 #
Proposal for a regulation
Recital 24
(24) The competent judicial authority or the competent independent administrative authority, as applicable in accordance with the detailed procedural rules set by the relevant Member State, should have the data necessary to be in a position to take a well-informed decision on requests for the issuance of investigation orders. That is of particular importance to ensure the necessary fair balance of the fundamental rights at stake and a consistent approach, especially in connection to investigation orders concerning the solicitation of children. Therefore, a procedure should be provided for that allows the providers concerned, the EU Centre on Child Sexual Abuse established by this Regulation (‘EU Centre’) and, where so provided in this Regulation, the competent data protection authority designated under Regulation (EU) 2016/679 to provide their views on the measures in question. They should do so without undue delay, having regard to the important public policy objective at stake and the need to act without undue delay to protect children. Furthermore, data protection authorities should do their utmost to avoid extending the time period set out in Regulation (EU) 2016/679 for providing their opinions in response to a prior consultation. Furthermore, they should normally be able to provide their opinion in a timely manner in situations where the European Data Protection Board has already issued guidelines regarding the technologies that a provider envisages deploying and operating to execute an investigation order addressed to it under this Regulation.
Amendment 210 #
Proposal for a regulation
Recital 25
Amendment 212 #
Proposal for a regulation
Recital 26
(26) The measures taken by providers of hosting services and providers of publicly available number-independent interpersonal communications services to execute investigation orders addressed to them should remain strictly limited to what is specified in this Regulation and in the orders issued in accordance with this Regulation. In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. This Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with those orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures are not undermined. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. Any weakening of encryption could potentially be abused by malicious third parties. Nothing in this Regulation should therefore be interpreted as prohibiting or weakening end-to-end encryption.
Amendment 214 #
Proposal for a regulation
Recital 27
(27) In order to facilitate the providers’ compliance with the investigation orders, the EU Centre should make available to providers approved technologies that they may choose to use, on a free-of-charge basis, for the sole purpose of executing the investigation orders addressed to them. The European Data Protection Board should be consulted on the acceptability or otherwise of those technologies and the ways in which they should be best deployed, if at all, in compliance with applicable rules of Union law on the protection of personal data and with the Charter of Fundamental Rights. The authoritative position of the European Data Protection Board should be fully taken into account by the EU Centre when compiling the lists of available technologies and also by the Commission when preparing guidelines regarding the application of the detection obligations. The providers may operate the technologies made available by the EU Centre or by others or technologies that they developed themselves, as long as they meet the requirements of this Regulation and other applicable EU law, such as Regulation (EU) 2016/679.
Amendment 215 #
Proposal for a regulation
Recital 28
(28) With a view to constantly assessing the performance of the detection technologies and ensuring that they are sufficiently accurate, as well as to identifying false positives and false negatives and avoiding erroneous reporting to the EU Centre, providers should ensure adequate human oversight and, where necessary, human intervention, adapted to the type of detection technologies and the type of online child sexual abuse at issue. Such oversight should include regular assessment of the rates of false negatives and positives generated by the technologies, based on an analysis of anonymised representative data samples. In particular where the detection of the solicitation of children in publicly available interpersonal communications is concerned, service providers should ensure regular, specific and detailed human oversight and human verification of conversations identified by the technologies as involving potential solicitation of children.
Amendment 216 #
Proposal for a regulation
Recital 29
(29) Providers of hosting services and providers of publicly available interpersonal communications services are uniquely positioned to detect potential online child sexual abuse involving their services. The information providers may obtain when offering their services is often indispensable to effectively investigate and prosecute child sexual abuse offences. Therefore, they should be required to report on potential online child sexual abuse on their services, whenever they become aware of it, that is, when there are reasonable grounds to believe that a particular activity may constitute online child sexual abuse. Where such reasonable grounds exist, doubts about the potential victim’s age should not prevent those providers from submitting reports. In the interest of effectiveness, it should be immaterial in which manner they obtain such awareness. Those providers should report a minimum of information, as specified in this Regulation, for competent law enforcement authorities to be able to assess whether to initiate an investigation, where relevant, and should ensure that the reports are as complete as possible before submitting them.
Amendment 220 #
Proposal for a regulation
Recital 32
(32) The obligations of this Regulation do not apply to providers of hosting services that do not offer their services in the Union. However, such services may still be used to disseminate child sexual abuse material to or by users in the Union, causing harm to children and society at large, even if the providers’ activities are not targeted towards Member States and the total numbers of users of those services in the Union are limited. In view of the very high number of ratifications of the UN Convention on the Rights of the Child or its optional Protocol on child pornography globally, it should always be possible to have those providers remove or disable access to the material. Where problems arise in relation to specific jurisdictions, the Commission and Member States should use all reasonable means at their disposal to encourage and lead international efforts to remedy the situation.
Amendment 221 #
Amendment 225 #
Proposal for a regulation
Recital 44
(44) In order to provide clarity and enable effective, efficient and consistent coordination and cooperation both at national and at Union level, where a Member State designates more than one competent authority to apply and enforce this Regulation, it should designate one lead authority as the Coordinating Authority, whilst the designated authority should automatically be considered the Coordinating Authority where a Member State designates only one authority. For those reasons, the Coordinating Authority should act as the single contact point with regard to all matters related to the application of this Regulation and to contributing to the achievement of the objective of this Regulation, including for trusted organisations providing assistance to victims and providing education and awareness raising, without prejudice to the enforcement powers of other national authorities.
Amendment 226 #
Proposal for a regulation
Recital 49
(49) In order to verify that the rules of this Regulation, in particular those on specific measures and on the execution of investigation orders, removal orders or blocking orders that it issued, are effectively complied with, each Coordinating Authority should be able to carry out relevant searches.
Amendment 227 #
Proposal for a regulation
Recital 50
Amendment 228 #
Proposal for a regulation
Recital 55
(55) It is essential for the proper functioning of the system of mandatory detection and blocking of online child sexual abuse set up by this Regulation that the EU Centre receives, via the Coordinating Authorities, anonymised specific items of material identified as constituting child sexual abuse material or transcripts of conversations identified as constituting the solicitation of children related to a specific person, a specific group of people or a specific incident, such as may have been found for example during criminal investigations, so that that material or those conversations can serve as an accurate and reliable basis for the EU Centre to generate indicators of such abuse. In order to achieve that result, the identification should be made after a diligent assessment, conducted in the context of a procedure that guarantees a fair and objective outcome, either by the Coordinating Authorities themselves or by a court or an independent administrative authority other than the Coordinating Authority. Whilst the swift assessment, identification and submission of such material is important also in other contexts, it is crucial in connection to new child sexual abuse material and the solicitation of children reported under this Regulation, considering that this material can lead to the identification of ongoing or imminent abuse and the rescuing of victims. Therefore, specific time limits should be set in connection to such reporting.
Amendment 229 #
Proposal for a regulation
Recital 55 a (new)
(55 a) All communications containing illegal material should be encrypted to state-of-the-art standards; all access by staff to such content should be limited to what is necessary and thoroughly logged.
Amendment 230 #
Proposal for a regulation
Recital 56
(56) With a view to ensuring that the indicators generated by the EU Centre for the purpose of detection are as complete as possible, the submission of relevant material and transcripts should be done proactively by the Coordinating Authorities. However, the EU Centre should also be allowed to bring certain material or conversations to the attention of the Coordinating Authorities for those purposes.
Amendment 234 #
Proposal for a regulation
Recital 78
(78) Regulation (EU) 2021/1232 of the European Parliament and of the Council45 provides for a temporary solution in respect of the use of technologies by certain providers of publicly available interpersonal communications services for the purpose of combating online child sexual abuse, pending the preparation and adoption of a long-term legal framework. This Regulation provides that long-term legal framework. Regulation (EU) 2021/1232 should therefore be repealed. _________________ 45 Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse (OJ L 274, 30.7.2021, p. 41).
Amendment 235 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 1
This Regulation lays down uniform rules to address the misuse of relevant information society services for online child sexual abuse, in order to contribute to the proper functioning of the internal market and to create a safe, predictable and trusted online environment that facilitates innovation and in which fundamental rights enshrined in the Charter are effectively protected.
Amendment 239 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point b
(b) obligations on providers of hosting services and providers of publicly available number-independent interpersonal communication services to identify and report online child sexual abuse;
Amendment 244 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point e a (new)
(e a) Obligations on providers of online games;
Amendment 247 #
Proposal for a regulation
Article 1 – paragraph 4
4. This Regulation limits the exercise of the rights and obligations provided for in Article 5(1) and (3) and Article 6(1) of Directive 2002/58/EC insofar as necessary for the execution of the investigation orders issued in accordance with Section 2 of Chapter 1 of this Regulation.
Amendment 250 #
Proposal for a regulation
Article 2 – paragraph 1 – point a
(a) ‘hosting service’ means an information society service as defined in Article 3, point (g), third indent, of Regulation (EU) …/… [on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC];
Amendment 252 #
Proposal for a regulation
Article 2 – paragraph 1 – point b
(b) ‘number-independent interpersonal communications service’ means a publicly available service as defined in Article 2, point 7, of Directive (EU) 2018/1972, including services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service;
Amendment 256 #
Proposal for a regulation
Article 2 – paragraph 1 – point b a (new)
(b a) ‘number-independent interpersonal communications service within games’ means any service defined in Article 2, point 7 of Directive (EU) 2018/1972 which is part of a game.
Amendment 259 #
Proposal for a regulation
Article 2 – paragraph 1 – point e
Amendment 261 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point ii
(ii) a publicly available number-independent interpersonal communications service;
Amendment 262 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iv
Amendment 264 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – point iv a (new)
(iv a) online games;
Amendment 268 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(h a) ‘hotline’ means an organisation providing a mechanism, other than the reporting channels provided by law enforcement agencies, for receiving anonymous reports from the public about alleged child sexual abuse material and online child sexual exploitation, which is officially recognised by the Member State of establishment and has the mission of combating child sexual abuse;
Amendment 269 #
Proposal for a regulation
Article 2 – paragraph 1 – point h b (new)
(h b) ‘help-line’ means an organisation providing services for children in need as recognised by the Member State of establishment;
Amendment 278 #
Proposal for a regulation
Article 3 – paragraph 1
1. Providers of hosting services and providers of publicly available number-independent interpersonal communications services that are exposed to a substantial amount of child sexual abuse material shall identify, analyse and assess, for each such service that they offer, risks stemming from the design and functioning of the service, including its algorithmic systems, for the purpose of online child sexual abuse.
Amendment 280 #
Proposal for a regulation
Article 3 – paragraph 1 a (new)
1 a. A hosting service provider or publicly available number-independent interpersonal communication service is exposed to child sexual abuse material where the Coordinating Authority of the Member State of its main establishment or where its legal representative resides or is established has: (a) taken a decision, on the basis of objective factors, such as the provider having received two or more final removal orders in the previous 12 months, finding that the provider is exposed to child sexual abuse material; and (b) notified the decision referred to in point (a) to the provider.
Amendment 281 #
Proposal for a regulation
Article 3 – paragraph 2 – point a
Amendment 282 #
Proposal for a regulation
Article 3 – paragraph 2 – point a a (new)
(a a) any actual or foreseeable negative effects for the exercise of fundamental rights;
Amendment 283 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – introductory part
(b) the existence and implementation by the provider of a policy and the availability and effectiveness of functionalities to address the risk referred to in paragraph 1, including through the following:
Amendment 284 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 1
Amendment 285 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 2
Amendment 289 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
— functionalities enabling the effective protection of children online;
Amendment 296 #
Proposal for a regulation
Article 3 – paragraph 2 – point c
Amendment 297 #
Proposal for a regulation
Article 3 – paragraph 2 – point d
(d) the manner in which the provider designed and operates the service, including the business model, governance and relevant systems and processes, the design of their recommender systems and any other relevant algorithmic system and the impact thereof on that risk;
Amendment 298 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point i
Amendment 299 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point ii
Amendment 300 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 1
— enabling users to publicly search for other users and, in particular, for adult users to search for child users;
Amendment 302 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 2
— enabling users to initiate unsolicited contact with other users directly, in particular through private communications;
Amendment 311 #
Proposal for a regulation
Article 3 – paragraph 3 – subparagraph 1
The provider may request the EU Centre to provide a methodology for risk assessment in order to support the risk assessment.
Amendment 313 #
Proposal for a regulation
Article 3 – paragraph 3 – subparagraph 2
The costs incurred by the EU Centre for the performance of such an analysis shall be borne by the requesting provider. However, the EU Centre shall bear those costs where the provider is a micro, small or medium-sized enterprise. The EU Centre may reject the request on the basis that it is not reasonably necessary to support the risk assessment.
Amendment 314 #
Proposal for a regulation
Article 3 – paragraph 3 – subparagraph 3
Amendment 317 #
Proposal for a regulation
Article 3 – paragraph 4 – subparagraph 2 – point a
(a) for a service which is subject to an investigation order issued in accordance with Article 7, the provider shall update the risk assessment at the latest two months before the expiry of the period of application of the order;
Amendment 319 #
Proposal for a regulation
Article 4 – title
Amendment 320 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services and providers of publicly available number-independent interpersonal communications services that are exposed to a substantial amount of child sexual abuse material shall take proportionate and effective specific measures, tailored to the serious risk identified pursuant to Article 3, to minimise that risk. Such measures shall include some or all of the following:
Amendment 325 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) adapting, through appropriate technical and operational measures and staffing, the provider’s content moderation or recommender systems, its decision- making processes, the operation or functionalities of the service, or the content or enforcement of its terms and conditions;
Amendment 326 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(a a) providing technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety, and that are set to the most secure levels by default;
Amendment 329 #
Proposal for a regulation
Article 4 – paragraph 1 – point a b (new)
(a b) informing users about external resources and services in the user’s region on preventing child sexual abuse, counselling by help-lines, victim support and educational resources by hotlines and child protection organisations;
Amendment 331 #
Proposal for a regulation
Article 4 – paragraph 1 – point a c (new)
(a c) providing tools in a prominent way on their platform that allow users and potential victims to seek help from their local help-line;
Amendment 332 #
Proposal for a regulation
Article 4 – paragraph 1 – point a d (new)
Article 4 – paragraph 1 – point a d (new)
(a d) automatic mechanisms and interface design elements to inform users about external preventive intervention programmes;
Amendment 333 #
Proposal for a regulation
Article 4 – paragraph 1 – point b
Article 4 – paragraph 1 – point b
(b) adapting the provider’s internal processes or the internal supervision of the functioning of the service;
Amendment 340 #
Proposal for a regulation
Article 4 – paragraph 2 – introductory part
Article 4 – paragraph 2 – introductory part
2. The specific measures shall be:
Amendment 341 #
Proposal for a regulation
Article 4 – paragraph 2 – point a
Article 4 – paragraph 2 – point a
(a) effective and proportionate in mitigating the identified serious risk;
Amendment 343 #
Proposal for a regulation
Article 4 – paragraph 2 – point a a (new)
Article 4 – paragraph 2 – point a a (new)
(a a) subject to an implementation plan with clear objectives and methodologies for identifying and quantifying impacts on the identified serious risk and on the exercise of the fundamental rights of all affected parties. The implementation plan shall be reviewed every six months.
Amendment 345 #
Proposal for a regulation
Article 4 – paragraph 2 – point b
Article 4 – paragraph 2 – point b
(b) targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk, any impact on the functionality of the service as well as the provider’s financial and technological capabilities and the number of users;
Amendment 348 #
Proposal for a regulation
Article 4 – paragraph 3
Article 4 – paragraph 3
Amendment 355 #
Proposal for a regulation
Article 4 – paragraph 3 a (new)
Article 4 – paragraph 3 a (new)
3 a. Any requirement to take specific measures shall be without prejudice to Article 8 of Regulation (EU) 2022/2065 and shall entail neither a general obligation for hosting services providers to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.
Amendment 357 #
Proposal for a regulation
Article 4 – paragraph 4
Article 4 – paragraph 4
4. Where appropriate, providers of hosting services and providers of interpersonal communications services shall clearly describe in their terms of service the mitigation measures that they have taken. That description shall not include information that is likely to reduce the effectiveness of the mitigation measures.
Amendment 359 #
Proposal for a regulation
Article 4 – paragraph 5
Article 4 – paragraph 5
5. The Commission, in cooperation with Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue guidelines on the application of paragraphs 1 and 2, having due regard in particular to relevant technological developments and the manners in which the services covered by those provisions are offered and used.
Amendment 361 #
Proposal for a regulation
Article 4 a (new)
Article 4 a (new)
Article 4 a
Specific measures for platforms primarily used for the dissemination of pornographic content
Where an online platform is primarily used for the dissemination of user-generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure:
a. user-friendly reporting mechanisms to report alleged child sexual abuse material;
b. adequate professional human content moderation to rapidly process notices of alleged child sexual abuse material;
c. automatic mechanisms and interface design elements to inform users about external preventive intervention programmes in the user’s region.
Amendment 362 #
Proposal for a regulation
Article 4 b (new)
Article 4 b (new)
Article 4 b
Specific measures for number-independent interpersonal communications services within games
Providers of online games that operate a number-independent interpersonal communications service within their games shall take the necessary technical and organisational measures:
a) preventing users from initiating unsolicited contact with other users;
b) facilitating user-friendly reporting of alleged child sexual abuse material;
c) providing technical measures and tools that allow users to manage their own privacy, visibility, reachability and safety, and that are set to the most secure levels by default;
d) providing tools in a prominent way on their platform that allow users and potential victims to seek help from their local help-line.
Amendment 363 #
Proposal for a regulation
Article 5 – paragraph 1 – introductory part
Article 5 – paragraph 1 – introductory part
1. Providers of hosting services and providers of publicly available number-independent interpersonal communications services shall transmit, by three months from the date referred to in Article 3(4), to the Coordinating Authority of establishment a report specifying the following:
Amendment 365 #
Proposal for a regulation
Article 5 – paragraph 1 – point a
Article 5 – paragraph 1 – point a
(a) the process and the results of the risk assessment conducted or updated pursuant to Article 3, including the assessment of any potential remaining serious risk referred to in Article 3(5);
Amendment 366 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
Article 5 – paragraph 1 – point b
(b) any specific measures taken pursuant to Article 4.
Amendment 367 #
Proposal for a regulation
Article 5 – paragraph 2
Article 5 – paragraph 2
2. Within three months after receiving the report, the Coordinating Authority of establishment shall assess it and determine, on that basis and taking into account any other relevant information available to it, whether the risk assessment has been carried out or updated and the specific measures and implementation plans have been taken in accordance with the requirements of Articles 3 and 4.
Amendment 369 #
Proposal for a regulation
Article 5 – paragraph 3 – subparagraph 1
Article 5 – paragraph 3 – subparagraph 1
Where necessary for that assessment, that Coordinating Authority may require further information from the provider, to be provided without undue delay.
Amendment 370 #
Proposal for a regulation
Article 5 – paragraph 3 – subparagraph 2
Article 5 – paragraph 3 – subparagraph 2
Amendment 371 #
Proposal for a regulation
Article 5 – paragraph 4
Article 5 – paragraph 4
4. Without prejudice to Articles 7 and 27 to 29, where the requirements of Articles 3 and 4 have not been met, that Coordinating Authority shall require the provider to make specific updates to the risk assessment or to introduce, review, discontinue or expand, as applicable, the mitigation measures, within a reasonable time period set by that Coordinating Authority. That time period shall not be longer than one month.
Amendment 372 #
Proposal for a regulation
Article 5 – paragraph 6
Article 5 – paragraph 6
Amendment 377 #
Proposal for a regulation
Article 6 – paragraph 1 – point a
Article 6 – paragraph 1 – point a
(a) ensure that they can only make available on their platform software applications if, prior to the use of their service, they have obtained the contact details of the software application developing team;
Amendment 378 #
Proposal for a regulation
Article 6 – paragraph 1 – point b
Article 6 – paragraph 1 – point b
Amendment 380 #
Proposal for a regulation
Article 6 – paragraph 1 – point c
Article 6 – paragraph 1 – point c
Amendment 382 #
Proposal for a regulation
Article 6 – paragraph 2
Article 6 – paragraph 2
Amendment 383 #
Proposal for a regulation
Article 6 – paragraph 3
Article 6 – paragraph 3
Amendment 384 #
Proposal for a regulation
Article 6 – paragraph 4
Article 6 – paragraph 4
Amendment 388 #
Proposal for a regulation
Article 6 a (new)
Article 6 a (new)
Article 6 a
Security of communications and services
Nothing in this Regulation shall be construed as encouraging the prohibition, restriction, circumvention or undermining of the provision or the use of encrypted services.
Amendment 391 #
Proposal for a regulation
Chapter II – Section 2 – title
Chapter II – Section 2 – title
2 Investigation obligations
Amendment 393 #
Proposal for a regulation
Article 7 – title
Article 7 – title
Issuance of investigation orders
Amendment 396 #
Proposal for a regulation
Article 7 – paragraph 1
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power to request the competent judicial authority of the Member State that designated it to issue an investigation order requiring a provider of hosting services or a provider of publicly available number-independent interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to assist in investigations of a specific person, specific group of people, or a specific incident related to online child sexual abuse on a specific service.
Amendment 401 #
Proposal for a regulation
Article 7 – paragraph 2 – subparagraph 1
Article 7 – paragraph 2 – subparagraph 1
The Coordinating Authority of establishment shall, before requesting the issuance of an investigation order, carry out the investigations and assessments necessary to determine whether the conditions of paragraph 4 have been met.
Amendment 403 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – introductory part
Article 7 – paragraph 3 – subparagraph 1 – introductory part
Where the Coordinating Authority of establishment takes the preliminary view that the conditions of paragraph 4 have been met, it shall:
Amendment 404 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 1 – point a
Article 7 – paragraph 3 – subparagraph 1 – point a
(a) establish a draft request for the issuance of an investigation order, specifying the factual and legal grounds upon which the request is based, the main elements of the content of the investigation order it intends to request and the reasons for requesting it;
Amendment 406 #
Amendment 408 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – introductory part
Article 7 – paragraph 3 – subparagraph 2 – introductory part
Where, having regard to the comments of the provider and the opinion of the EU Centre, that Coordinating Authority continues to be of the view that the conditions of paragraph 4 are met, it shall request the judicial validation of the inquiry/investigation order from the competent judicial authority responsible for the issuing of such orders pursuant to paragraph 4. Upon receipt of judicial validation of the order, the Coordinating Authority shall submit the order, adjusted where appropriate, to the provider. Prior to requesting the judicial validation of the investigation order, the Coordinating Authority shall request the provider to do all of the following, within a reasonable time period set by that Coordinating Authority:
Amendment 410 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point a
Article 7 – paragraph 3 – subparagraph 2 – point a
(a) draft an implementation plan setting out the incident that the authority intends to investigate, the measures it envisages taking to execute the intended investigation order, including detailed information regarding the envisaged technologies and safeguards;
Amendment 412 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point b
Article 7 – paragraph 3 – subparagraph 2 – point b
(b) where the draft implementation plan concerns an intended investigation order concerning the solicitation of children other than the renewal of a previously issued investigation order without any substantive changes, conduct a data protection impact assessment and a prior consultation procedure as referred to in Articles 35 and 36 of Regulation (EU) 2016/679, respectively, in relation to the measures set out in the implementation plan;
Amendment 414 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point c
Article 7 – paragraph 3 – subparagraph 2 – point c
(c) where point (b) applies, or where the conditions of Articles 35 and 36 of Regulation (EU) 2016/679 are met, adjust the draft implementation plan, where necessary in view of the outcome of the data protection impact assessment and in order to take utmost account of the opinion of the data protection authority provided in response to the prior consultation;
Amendment 415 #
Proposal for a regulation
Article 7 – paragraph 3 – subparagraph 2 – point d
Article 7 – paragraph 3 – subparagraph 2 – point d
(d) submit to that Coordinating Authority the implementation plan, where applicable attaching the opinion of the competent data protection authority and specifying how the implementation plan has been adjusted to take full account of the outcome of the data protection impact assessment and of that opinion.
Amendment 417 #
Where, having regard to the implementation plan of the provider and taking utmost account of the opinion of the data protection authority, that Coordinating Authority is of the view that the conditions of paragraph 4 are met, it shall submit the request for the validation and issuance of the investigation order, adjusted where appropriate, to the competent judicial authority or independent administrative authority. It shall attach the implementation plan of the provider and the opinions of the EU Centre and the data protection authority to that request.
Amendment 421 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – introductory part
Article 7 – paragraph 4 – subparagraph 1 – introductory part
Based on a reasoned justification, the Coordinating Authority of establishment shall request the issuance of the investigation order, and the competent judicial authority or independent administrative authority shall issue the investigation order where it considers that the following conditions are met:
Amendment 422 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point a
Article 7 – paragraph 4 – subparagraph 1 – point a
(a) there is evidence of a specific risk of the service being used for the purpose of online child sexual abuse by one or more suspects, within the meaning of paragraphs 5, 6 and 7, as applicable;
Amendment 424 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 1 – point b
Article 7 – paragraph 4 – subparagraph 1 – point b
(b) the investigation order is necessary and proportionate and outweighs negative consequences for the rights and legitimate interests of all parties affected, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties.
Amendment 426 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2
Article 7 – paragraph 4 – subparagraph 2
Amendment 427 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point a
Article 7 – paragraph 4 – subparagraph 2 – point a
Amendment 428 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point b
Article 7 – paragraph 4 – subparagraph 2 – point b
Amendment 429 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point c
Article 7 – paragraph 4 – subparagraph 2 – point c
Amendment 430 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 2 – point d
Article 7 – paragraph 4 – subparagraph 2 – point d
Amendment 431 #
Proposal for a regulation
Article 7 – paragraph 4 – subparagraph 3
Article 7 – paragraph 4 – subparagraph 3
Amendment 433 #
Proposal for a regulation
Article 7 – paragraph 5 – introductory part
Article 7 – paragraph 5 – introductory part
5. As regards investigation orders concerning the dissemination of known child sexual abuse material, the specific risk referred to in paragraph 4, first subparagraph, point (a), shall be deemed to exist where the following conditions are met:
Amendment 435 #
Proposal for a regulation
Article 7 – paragraph 5 – point a
Article 7 – paragraph 5 – point a
(a) it is likely, despite any mitigation measures that the provider may have taken or will take, that the service is being used by the suspect or suspects for the dissemination of known child sexual abuse material;
Amendment 437 #
Proposal for a regulation
Article 7 – paragraph 5 – point b
Article 7 – paragraph 5 – point b
(b) there is evidence of the service, or of a comparable service if the service has not yet been offered in the Union at the date of the request for the issuance of the detection order, having been used in the past 12 months by one or more suspects for the dissemination of known child sexual abuse material.
Amendment 438 #
Proposal for a regulation
Article 7 – paragraph 6
Article 7 – paragraph 6
Amendment 442 #
Proposal for a regulation
Article 7 – paragraph 7
Article 7 – paragraph 7
Amendment 446 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 1
Article 7 – paragraph 8 – subparagraph 1
The Coordinating Authority of establishment, when requesting the judicial validation and the issuance of investigation orders, and the competent judicial or independent administrative authority, when issuing the investigation order, shall target and specify it in such a manner that the negative consequences referred to in paragraph 4, first subparagraph, point (b), remain limited to what is strictly necessary to effectively address the significant risk referred to in point (a) thereof and proportionate to obtain the information required to effectively investigate the case, and collect the information required to assess the existence of a criminal offence.
Amendment 449 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 2
Article 7 – paragraph 8 – subparagraph 2
To that end, they shall take into account all relevant parameters, including the availability of sufficiently reliable investigative technologies in that they limit to the maximum extent possible the rate of errors regarding the investigation and their suitability and effectiveness for achieving the objectives of this Regulation, as well as the impact of the measures on the rights of the users affected, and require the taking of the least intrusive measures, in accordance with Article 10, from among several equally effective measures.
Amendment 450 #
Proposal for a regulation
Article 7 – paragraph 8 – subparagraph 3 – point b
Article 7 – paragraph 8 – subparagraph 3 – point b
(b) where necessary, in particular to limit such negative consequences, effective and proportionate safeguards additional to those listed in Article 10(4) and (5) are provided for;
Amendment 455 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 1
Article 7 – paragraph 9 – subparagraph 1
The competent judicial authority or independent administrative authority shall specify in the investigation order the period during which it applies, indicating the start date and the end date.
Amendment 456 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 2
Article 7 – paragraph 9 – subparagraph 2
The start date shall be set taking into account the time reasonably required for the provider to take the necessary measures to prepare the execution of the investigation order.
Amendment 457 #
Proposal for a regulation
Article 7 – paragraph 9 – subparagraph 3
Article 7 – paragraph 9 – subparagraph 3
The period of application of investigation orders shall be proportionate, taking all relevant factors into account.
Amendment 459 #
Proposal for a regulation
Article 8 – title
Article 8 – title
Additional rules regarding investigation orders
Amendment 462 #
Proposal for a regulation
Article 8 – paragraph 1 – introductory part
Article 8 – paragraph 1 – introductory part
1. The competent judicial authority or independent administrative authority shall issue the investigation orders referred to in Article 7 using the template set out in Annex I. Investigation orders shall include:
Amendment 463 #
Proposal for a regulation
Article 8 – paragraph 1 – point a
Article 8 – paragraph 1 – point a
(a) information regarding the measures to be taken to execute the investigation order, including the person, group of persons or incident concerned, the temporal scope, the indicators to be used and the safeguards to be provided for, including the reporting requirements set pursuant to Article 9(3) and, where applicable, any additional safeguards as referred to in Article 7(8);
Amendment 466 #
Proposal for a regulation
Article 8 – paragraph 1 – point b
Article 8 – paragraph 1 – point b
Amendment 468 #
Proposal for a regulation
Article 8 – paragraph 1 – point d
Article 8 – paragraph 1 – point d
(d) the specific service in respect of which the investigation order is issued and, where applicable, the part or component of the service affected as referred to in Article 7(8);
Amendment 469 #
Proposal for a regulation
Article 8 – paragraph 1 – point e
Article 8 – paragraph 1 – point e
(e) whether the investigation order issued concerns the dissemination of known or previously unknown child sexual abuse material or the solicitation of children;
Amendment 470 #
Proposal for a regulation
Article 8 – paragraph 1 – point f
Article 8 – paragraph 1 – point f
(f) the start date and the end date of the investigation order;
Amendment 471 #
Proposal for a regulation
Article 8 – paragraph 1 – point g
Article 8 – paragraph 1 – point g
(g) a sufficiently detailed statement of reasons explaining why the investigation order is issued;
Amendment 472 #
Proposal for a regulation
Article 8 – paragraph 1 – point h
Article 8 – paragraph 1 – point h
(h) the factual and legal grounds justifying the issuing of the order, and a reference to this Regulation as the legal basis for the detection order;
Amendment 475 #
Proposal for a regulation
Article 8 – paragraph 1 – point i
Article 8 – paragraph 1 – point i
(i) the date, time stamp and electronic signature of the judicial or independent administrative authority issuing the investigation order;
Amendment 476 #
Proposal for a regulation
Article 8 – paragraph 1 – point j
Article 8 – paragraph 1 – point j
(j) easily understandable information about the redress available to the addressee of the investigation order, including information about redress to a court and about the time periods applicable to such redress.
Amendment 479 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 1
Article 8 – paragraph 2 – subparagraph 1
The competent judicial authority or independent administrative authority issuing the investigation order shall address it to the main establishment of the provider or, where applicable, to its legal representative designated in accordance with Article 24.
Amendment 480 #
The investigation order shall be securely transmitted to the provider’s point of contact referred to in Article 23(1), to the Coordinating Authority of establishment and to the EU Centre, through the system established in accordance with Article 39(2).
Amendment 481 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 3
Article 8 – paragraph 2 – subparagraph 3
The investigation order shall be drafted in the language declared by the provider pursuant to Article 23(3).
Amendment 482 #
Proposal for a regulation
Article 8 – paragraph 3
Article 8 – paragraph 3
3. If the provider cannot execute the investigation order because it contains errors, or it appears unnecessary or disproportionate, or does not contain sufficient information for its execution, the provider shall, without undue delay, request the necessary correction or clarification to the Coordinating Authority of establishment, using the template set out in Annex II.
Amendment 483 #
Proposal for a regulation
Article 8 a (new)
Article 8 a (new)
Amendment 484 #
Proposal for a regulation
Article 8 c (new)
Article 8 c (new)
Article 8 c
Notification mechanism
1. Providers of hosting services and providers of interpersonal communication services shall establish mechanisms that allow users to notify them of the presence on their service of specific items or activities that the user considers to be potential child sexual abuse material, in particular previously unknown child sexual abuse material and the solicitation of children. Those mechanisms shall be easy to access, user-friendly and child-friendly, and shall allow for the submission of notices exclusively by electronic means.
2. Where the notice contains the electronic contact information of the user who submitted it, the provider shall without undue delay send a confirmation of receipt to the user.
3. Providers shall ensure that such notices are processed without undue delay.
Amendment 486 #
Proposal for a regulation
Article 9 – title
Article 9 – title
Redress, information, reporting and modification of investigation orders
Amendment 487 #
Proposal for a regulation
Article 9 – paragraph 1
Article 9 – paragraph 1
1. Providers of hosting services and providers of publicly available number-independent interpersonal communications services that have received an investigation order, as well as users affected by the measures taken to execute it, shall have a right to effective redress. That right shall include the right to challenge the investigation order before the courts of the Member State of the competent judicial authority or independent administrative authority that issued the investigation order.
Amendment 492 #
Proposal for a regulation
Article 9 – paragraph 2 – subparagraph 1
Article 9 – paragraph 2 – subparagraph 1
When the investigation order becomes final, the competent judicial authority or independent administrative authority that issued the investigation order shall, without undue delay, transmit a copy thereof to the Coordinating Authority of establishment. The Coordinating Authority of establishment shall then, without undue delay, transmit a copy thereof to all other Coordinating Authorities through the system established in accordance with Article 39(2).
Amendment 493 #
Proposal for a regulation
Article 9 – paragraph 2 – subparagraph 2
Article 9 – paragraph 2 – subparagraph 2
For the purpose of the first subparagraph, an investigation order shall become final upon the expiry of the time period for appeal where no appeal has been lodged in accordance with national law or upon confirmation of the investigation order following an appeal.
Amendment 494 #
Proposal for a regulation
Article 9 – paragraph 3 – subparagraph 1
Article 9 – paragraph 3 – subparagraph 1
Where the period of application of the investigation order exceeds 12 months, or six months in the case of an investigation order concerning the solicitation of children, the Coordinating Authority of establishment shall require the provider to report to it on the execution of the investigation order at least once, halfway through the period of application.
Amendment 495 #
Proposal for a regulation
Article 9 – paragraph 3 – subparagraph 2
Article 9 – paragraph 3 – subparagraph 2
Those reports shall include a detailed description of the measures taken to execute the investigation order, including the safeguards provided, and information on the functioning in practice of those measures, in particular on their effectiveness in detecting the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, and on the consequences of those measures for the rights and legitimate interests of all parties affected.
Amendment 496 #
Proposal for a regulation
Article 9 – paragraph 4 – subparagraph 1
Article 9 – paragraph 4 – subparagraph 1
In respect of the investigation orders that the competent judicial authority or independent administrative authority issued at its request, the Coordinating Authority of establishment shall, where necessary and in any event following reception of the reports referred to in paragraph 3, assess whether any substantial changes to the grounds for issuing the detection orders occurred and, in particular, whether the conditions of Article 7(4) continue to be met. In that regard, it shall take account of additional mitigation measures that the provider may take to address the significant risk identified at the time of the issuance of the investigation order.
Amendment 501 #
Proposal for a regulation
Article 9 – paragraph 4 – subparagraph 2
Article 9 – paragraph 4 – subparagraph 2
That Coordinating Authority shall request to the competent judicial authority or independent administrative authority that issued the investigation order the modification or revocation of such order, where necessary in the light of the outcome of that assessment. The provisions of this Section shall apply to such requests, mutatis mutandis.
Amendment 504 #
Proposal for a regulation
Article 10 – paragraph 1
Article 10 – paragraph 1
1. Providers of hosting services and providers of publicly available number-independent interpersonal communication services that have received an investigation order shall execute it by installing and operating technologies to collect evidence on the dissemination of known or previously unknown child sexual abuse material or the solicitation of children, as applicable, using the corresponding indicators provided, if necessary specific technologies approved for this purpose, by the EU Centre in accordance with Article 46.
Amendment 505 #
Proposal for a regulation
Article 10 – paragraph 2
Article 10 – paragraph 2
2. The provider shall be entitled to acquire, install and operate, free of charge, technologies specified in the orders and made available by the EU Centre in accordance with Article 50(1), for the sole purpose of executing the investigation order.
Amendment 506 #
Proposal for a regulation
Article 10 – paragraph 3 – introductory part
Article 10 – paragraph 3 – introductory part
3. The technologies specified in the investigation orders shall be:
Amendment 507 #
Proposal for a regulation
Article 10 – paragraph 3 – point a
Article 10 – paragraph 3 – point a
(a) effective in collecting evidence on the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable;
Amendment 508 #
Proposal for a regulation
Article 10 – paragraph 3 – point b
Article 10 – paragraph 3 – point b
(b) not be able to extract any other information from the relevant communications than the information strictly necessary to investigate, including using the indicators referred to in paragraph 1, patterns pointing to the dissemination of known or previously unknown child sexual abuse material or the solicitation of children, as applicable;
Amendment 510 #
Proposal for a regulation
Article 10 – paragraph 3 – point c
Article 10 – paragraph 3 – point c
(c) in accordance with the technological state of the art in the industry and the least intrusive in terms of the impact on the users’ rights to private and family life, including the confidentiality of communication, and to protection of personal data;
Amendment 511 #
Proposal for a regulation
Article 10 – paragraph 3 – point d
Article 10 – paragraph 3 – point d
(d) sufficiently reliable, in that they limit to the maximum extent possible the rate of errors regarding the investigation.
Amendment 513 #
Proposal for a regulation
Article 10 – paragraph 4 – introductory part
Article 10 – paragraph 4 – introductory part
4. The issuing authority shall:
Amendment 514 #
Proposal for a regulation
Article 10 – paragraph 4 – point a
Article 10 – paragraph 4 – point a
(a) take all the necessary measures to ensure that the technologies specified in investigation orders and the indicators, as well as the processing of personal data and other data in connection thereto, are proportionate for the sole purpose of investigating the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable, and strictly necessary to execute the investigation orders they issue;
Amendment 516 #
Proposal for a regulation
Article 10 – paragraph 4 – point b
Article 10 – paragraph 4 – point b
(b) include in investigation orders specific internal procedures for providers to prevent and, where necessary, detect and remedy any misuse of the technologies, indicators and personal data and other data referred to in point (a), including unauthorised access to, and unauthorised transfers of, such personal data and other data;
Amendment 517 #
Proposal for a regulation
Article 10 – paragraph 4 – point c
Article 10 – paragraph 4 – point c
(c) include in investigation orders specific obligations on providers to ensure regular human oversight as necessary to ensure that the technologies operate in a sufficiently reliable manner and, where necessary, in particular when potential errors and potential solicitation of children are detected, human intervention;
Amendment 520 #
Proposal for a regulation
Article 10 – paragraph 4 – point d
Article 10 – paragraph 4 – point d
(d) establish and operate an accessible, age-appropriate and user-friendly mechanism that allows users to submit to it, within a reasonable timeframe, complaints about alleged infringements of providers’ obligations under this Section, as well as any decisions that the provider may have taken in relation to the use of the technologies, including the removal or disabling of access to material provided by users, blocking the users’ accounts or suspending or terminating the provision of the service to the users, and process such complaints in an objective, effective and timely manner;
Amendment 522 #
Proposal for a regulation
Article 10 – paragraph 4 – point e
Article 10 – paragraph 4 – point e
(e) inform the Coordinating Authority, as appropriate, at the latest one month before the start date specified in the investigation order, on the implementation of the envisaged measures set out in the implementation plan referred to in Article 7(3);
Amendment 525 #
Proposal for a regulation
Article 10 – paragraph 5 – subparagraph 1 – point a
Article 10 – paragraph 5 – subparagraph 1 – point a
Amendment 526 #
Proposal for a regulation
Article 10 – paragraph 5 – subparagraph 1 – point b
Article 10 – paragraph 5 – subparagraph 1 – point b
Amendment 528 #
Proposal for a regulation
Article 10 – paragraph 5 – subparagraph 2
Article 10 – paragraph 5 – subparagraph 2
The provider shall not provide information to users that may reduce the effectiveness of the measures to execute the investigation order.
Amendment 529 #
Proposal for a regulation
Article 10 – paragraph 6
Article 10 – paragraph 6
Amendment 531 #
Proposal for a regulation
Article 11 – title
Article 11 – title
Guidelines regarding investigation obligations
Amendment 532 #
Proposal for a regulation
Article 11 – paragraph 1
Article 11 – paragraph 1
The Commission, in cooperation with the Coordinating Authorities and the EU Centre and after having conducted a public consultation, may issue delegated acts on the application of Articles 7 to 10, having due regard in particular to relevant technological developments and the manners in which the services covered by those provisions are offered and used.
Amendment 533 #
Proposal for a regulation
Article 12 – paragraph 1
Article 12 – paragraph 1
1. Where a provider of hosting services or a provider of publicly available number-independent interpersonal communications services has actual knowledge of alleged online child sexual abuse on its services in any manner other than through a removal order issued in accordance with this Regulation, it shall promptly submit, using state of the art encryption, a report to the EU Centre in accordance with Article 13 and shall expeditiously remove such content, once the EU Centre confirms this will not prejudice an ongoing investigation. It shall do so through the system established in accordance with Article 39(2).
Amendment 536 #
Proposal for a regulation
Article 12 – paragraph 2 – subparagraph 1
Article 12 – paragraph 2 – subparagraph 1
Where the provider submits a report pursuant to paragraph 1, it shall request authorisation from the EU Centre to inform the user concerned, which shall reply without undue delay, at maximum within two days. The notification to the user shall include information on the main content of the report, on the manner in which the provider has become aware of the alleged child sexual abuse concerned, on the follow-up given to the report insofar as such information is available to the provider and on the user’s possibilities of redress, including on the right to submit complaints to the Coordinating Authority in accordance with Article 34.
Amendment 537 #
Proposal for a regulation
Article 12 – paragraph 2 – subparagraph 2
Article 12 – paragraph 2 – subparagraph 2
Amendment 538 #
Proposal for a regulation
Article 12 – paragraph 2 – subparagraph 3
Article 12 – paragraph 2 – subparagraph 3
Amendment 544 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
Article 13 – paragraph 1 – introductory part
1. Providers of hosting services and providers of publicly available number-independent interpersonal communications services shall submit the report referred to in Article 12 using the template set out in Annex III. The report shall include:
Amendment 545 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
Article 13 – paragraph 1 – point c
(c) encrypted versions of all content data being reported;
Amendment 546 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
Article 13 – paragraph 1 – point d
(d) a list of all available data other than content data related to the potential online child sexual abuse preserved in line with the preservation order in Article 8a;
Amendment 547 #
Proposal for a regulation
Article 13 – paragraph 1 – point d a (new)
Article 13 – paragraph 1 – point d a (new)
(d a) a list of all traffic data and metadata retained in relation to the potential online child sexual abuse, which could be made available to law enforcement authorities, together with information concerning default storage periods.
Amendment 548 #
Proposal for a regulation
Article 13 – paragraph 1 – point e
Article 13 – paragraph 1 – point e
Amendment 549 #
Proposal for a regulation
Article 13 – paragraph 1 – point f
Article 13 – paragraph 1 – point f
Amendment 550 #
Proposal for a regulation
Article 13 – paragraph 1 – point g
Article 13 – paragraph 1 – point g
Amendment 551 #
Proposal for a regulation
Article 13 – paragraph 1 – point i
Article 13 – paragraph 1 – point i
(i) where the alleged online child sexual abuse concerns the dissemination of known or previously unknown child sexual abuse material, whether the provider has removed or disabled access to the material;
Amendment 552 #
Proposal for a regulation
Article 13 – paragraph 1 – point j
Article 13 – paragraph 1 – point j
(j) an indication that the report requires urgent action;
Amendment 555 #
Proposal for a regulation
Article 14 – paragraph 1 a (new)
Article 14 – paragraph 1 a (new)
1 a. Before issuing a removal order, the Coordinating Authority of establishment shall take all reasonable steps to ensure that implementing the order will not interfere with activities for the prevention, detection, investigation and prosecution of child sexual abuse offences.
Amendment 561 #
Proposal for a regulation
Article 14 – paragraph 3 – point c
Article 14 – paragraph 3 – point c
Amendment 567 #
Proposal for a regulation
Article 15 – paragraph 3 – point b
Article 15 – paragraph 3 – point b
(b) the reasons for the removal or disabling, providing a copy of the removal order upon the user’s request;
Amendment 568 #
Proposal for a regulation
Article 15 – paragraph 4
Article 15 – paragraph 4
Amendment 575 #
Proposal for a regulation
Article 19
Article 19
Amendment 580 #
Proposal for a regulation
Article 25 – paragraph 5
Article 25 – paragraph 5
5. Each Member State shall ensure that a contact point is designated or established within the Coordinating Authority’s office to handle requests for clarification, feedback and other communications in relation to all matters related to contributing to the achievement of the objective of this Regulation in that Member State, including for trusted organisations providing assistance to victims and providing education and awareness raising. Member States shall make the information on the contact point publicly available and communicate it to the EU Centre. They shall keep that information updated.
Amendment 582 #
Proposal for a regulation
Article 25 – paragraph 7 – introductory part
Article 25 – paragraph 7 – introductory part
7. Coordinating Authorities may, where necessary for the performance of their tasks under this Regulation, request the assistance of the EU Centre in carrying out those tasks, in particular by requesting the EU Centre to:
Amendment 583 #
Proposal for a regulation
Article 25 – paragraph 7 – point a
Article 25 – paragraph 7 – point a
Amendment 584 #
Proposal for a regulation
Article 25 – paragraph 7 – point b
Article 25 – paragraph 7 – point b
Amendment 585 #
Proposal for a regulation
Article 25 – paragraph 7 – point c
Article 25 – paragraph 7 – point c
Amendment 586 #
Proposal for a regulation
Article 25 – paragraph 7 – point d
Article 25 – paragraph 7 – point d
Amendment 588 #
Proposal for a regulation
Article 25 – paragraph 8
Article 25 – paragraph 8
8. The EU Centre shall provide such assistance free of charge and in accordance with its tasks and obligations under this Regulation and insofar as its resources and priorities allow.
Amendment 599 #
Proposal for a regulation
Article 26 – paragraph 5
Article 26 – paragraph 5
5. Without prejudice to national or Union legislation on whistleblower protection, the management and other staff of the Coordinating Authorities shall, in accordance with Union or national law, be subject to a duty of professional secrecy both during and after their term of office, with regard to any confidential information which has come to their knowledge in the course of the performance of their tasks. Member States shall ensure that the management and other staff are subject to rules guaranteeing that they can carry out their tasks in an objective, impartial and independent manner, in particular as regards their appointment, dismissal, remuneration and career prospects.
Amendment 602 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
Article 27 – paragraph 1 – point b
(b) the power to carry out remote or on-site inspections of any premises that those providers or the other persons referred to in point (a) use for purposes related to their trade, business, craft or profession, or to request other public authorities to do so, in order to examine, seize, take or obtain copies of information relating to a suspected infringement of this Regulation in any form, irrespective of the storage medium;
Amendment 603 #
Proposal for a regulation
Article 27 – paragraph 1 – point d
Article 27 – paragraph 1 – point d
(d) the power to request information, including to assess the compliance with the requirements of this Regulation.
Amendment 604 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
Article 28 – paragraph 1 – point b
(b) the power to order specific measures to bring about the cessation of infringements of this Regulation and, where appropriate, to impose remedies proportionate to the infringement and necessary to bring the infringement effectively to an end;
Amendment 605 #
Proposal for a regulation
Article 29 – paragraph 1 – point b
Article 29 – paragraph 1 – point b
(b) the infringement persists; and
Amendment 606 #
Proposal for a regulation
Article 29 – paragraph 2 – point a – point i
Article 29 – paragraph 2 – point a – point i
(i) adopt and submit an action plan setting out the necessary measures to terminate the infringement, subject to the approval of the Coordinating Authority;
Amendment 608 #
Proposal for a regulation
Article 29 – paragraph 2 – point b – point ii
Article 29 – paragraph 2 – point b – point ii
(ii) the infringement persists and causes serious harm that is greater than the likely harm to users relying on the service for legal purposes; and
Amendment 609 #
Proposal for a regulation
Article 29 – paragraph 4 – subparagraph 3 – point a
Article 29 – paragraph 4 – subparagraph 3 – point a
(a) the provider has failed to take the necessary and proportionate measures to terminate the infringement;
Amendment 610 #
Proposal for a regulation
Article 30 – paragraph 2
Article 30 – paragraph 2
2. Member States shall ensure that any exercise of the investigatory and enforcement powers referred to in Articles 27, 28 and 29 is subject to adequate safeguards laid down in the applicable national law to respect the fundamental rights of all parties affected. In particular, those measures shall be targeted and precise, be taken in accordance with the right to respect for private life and the rights of defence, including the rights to be heard and of access to the file, and subject to the right to an effective judicial remedy of all parties affected.
Amendment 611 #
Proposal for a regulation
Article 31 – paragraph 1
Article 31 – paragraph 1
Coordinating Authorities shall have the power to carry out searches on publicly accessible material on hosting services to detect the dissemination of known or new child sexual abuse material, using the indicators contained in the databases referred to in Article 44(1), points (a) and (b), where necessary to verify whether the providers of hosting services under the jurisdiction of the Member State that designated the Coordinating Authorities comply with their obligations under this Regulation.
Amendment 612 #
Proposal for a regulation
Article 32
Article 32
Amendment 615 #
Proposal for a regulation
Article 35 – paragraph 2
Article 35 – paragraph 2
2. Member States shall ensure that the maximum amount of penalties imposed for an infringement of this Regulation shall not exceed 6 % of the annual income or worldwide turnover of the preceding business year of the provider.
Amendment 616 #
Proposal for a regulation
Article 35 – paragraph 3
Article 35 – paragraph 3
3. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information or to submit to an on-site inspection shall not exceed 1% of the annual income or worldwide turnover of the preceding business year of the provider or the other person referred to in Article 27.
Amendment 617 #
Proposal for a regulation
Article 35 – paragraph 4
Article 35 – paragraph 4
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily worldwide turnover of the provider or the other person referred to in Article 27 in the preceding financial year per day, calculated from the date specified in the decision concerned.
Amendment 619 #
Proposal for a regulation
Article 36 – paragraph 1 – subparagraph 1 – point a
Article 36 – paragraph 1 – subparagraph 1 – point a
(a) anonymised specific items of material and transcripts of conversations related to a specific person, specific group of people, or specific incident that Coordinating Authorities or that the competent judicial authorities or other independent administrative authorities of a Member State have identified, after a diligent assessment, as constituting child sexual abuse material or the solicitation of children, as applicable, for the EU Centre to generate indicators in accordance with Article 44(3);
Amendment 620 #
Proposal for a regulation
Article 36 – paragraph 1 – subparagraph 1 – point b
Article 36 – paragraph 1 – subparagraph 1 – point b
(b) exact uniform resource locators indicating specific items of material related to a specific person, specific group of people, or specific incident that Coordinating Authorities or that competent judicial authorities or other independent administrative authorities of a Member State have identified, after a diligent assessment, as constituting child sexual abuse material, hosted by providers of hosting services not offering services in the Union, that cannot be removed due to those providers’ refusal to remove or disable access thereto and to the lack of cooperation by the competent authorities of the third country having jurisdiction, for the EU Centre to compile a public list of states that facilitate crimes against children by not ensuring expeditious removal of child sexual abuse material.
Amendment 621 #
Proposal for a regulation
Article 36 – paragraph 1 – subparagraph 2
Article 36 – paragraph 1 – subparagraph 2
Member States shall take the necessary measures to ensure that the Coordinating Authorities that they designated receive, without undue delay, the encrypted copies of the material identified as child sexual abuse material, the transcripts of conversations related to a specific person, specific group of people, or specific incident identified as the solicitation of children, and the uniform resource locators, identified by a competent judicial authority or other independent administrative authority than the Coordinating Authority, for submission to the EU Centre in accordance with the first subparagraph.
Amendment 622 #
Proposal for a regulation
Article 37 – paragraph 1 – subparagraph 2
Article 37 – paragraph 1 – subparagraph 2
Where, in the reasoned opinion of the Commission, there are grounds to suspect that a provider of relevant information society services infringed this Regulation in a manner causing harm in at least three Member States, it may recommend that the Coordinating Authority of establishment assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation.
Amendment 623 #
Proposal for a regulation
Article 37 – paragraph 2 – point c
Article 37 – paragraph 2 – point c
(c) any other information that the Coordinating Authority that sent the request, or the Commission, considers relevant, including, where appropriate, information gathered on its own initiative and suggestions for specific investigatory or enforcement measures to be taken.
Amendment 624 #
Proposal for a regulation
Article 37 – paragraph 3 – subparagraph 1
Article 37 – paragraph 3 – subparagraph 1
The Coordinating Authority of establishment shall assess the suspected infringement, taking utmost account of the request or recommendation referred to in paragraph 1.
Amendment 625 #
Proposal for a regulation
Article 37 – paragraph 3 – subparagraph 2
Article 37 – paragraph 3 – subparagraph 2
Where it considers that it has insufficient information to assess the suspected infringement or to act upon the request or recommendation and has reasons to consider that the Coordinating Authority that sent the request, or the Commission, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided.
Amendment 626 #
Proposal for a regulation
Article 37 – paragraph 4
Article 37 – paragraph 4
4. The Coordinating Authority of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation referred to in paragraph 1, communicate to the Coordinating Authority that sent the request, or the Commission, the outcome of its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and, where applicable, details of the investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation.
Amendment 636 #
Proposal for a regulation
Article 55 – paragraph 1 – point d a (new)
Article 55 – paragraph 1 – point d a (new)
(d a) a Survivors‘ Advisory Board as an advisory group, which shall exercise the tasks set out in Article 66a (new).
Amendment 637 #
Proposal for a regulation
Article 57 – paragraph 1 – point c
Article 57 – paragraph 1 – point c
(c) adopt rules for the prevention and management of conflicts of interest in respect of its members, as well as for the members of the Technological Committee and of the Survivors’ Advisory Board, and publish annually on its website the declaration of interests of the members of the Management Board;
Amendment 638 #
Proposal for a regulation
Article 57 – paragraph 1 – point f
Article 57 – paragraph 1 – point f
(f) appoint the members of the Technology Committee, and of the Survivors’ Advisory Board;
Amendment 639 #
Proposal for a regulation
Article 57 – paragraph 1 – point h a (new)
Article 57 – paragraph 1 – point h a (new)
(h a) consult the Survivors’ Advisory Board as regards the obligations referred to in points (a) and (h) of this Article.
Amendment 640 #
Proposal for a regulation
Article 66 a (new)
Article 66 a (new)
Amendment 641 #
Proposal for a regulation
Article 83 – paragraph 1 – introductory part
Article 83 – paragraph 1 – introductory part
1. Providers of hosting services, providers of publicly available number-independent interpersonal communications services and providers of internet access services shall collect data on the following topics and make that information public:
Amendment 642 #
Proposal for a regulation
Article 83 – paragraph 1 – point a – introductory part
Article 83 – paragraph 1 – point a – introductory part
(a) where the provider has been subject to an investigation order issued in accordance with Article 7:
Amendment 643 #
Proposal for a regulation
Article 83 – paragraph 1 – point a – indent 1
Article 83 – paragraph 1 – point a – indent 1
— the measures taken to comply with the order, including the technologies used for that purpose and the safeguards provided;
Amendment 644 #
Proposal for a regulation
Article 83 – paragraph 1 – point a – indent 2
Article 83 – paragraph 1 – point a – indent 2
— the error rates of false positives and false negatives related to a specific person, specific group of people or specific incident, and steps taken to mitigate the harm caused by any inaccuracy;
Amendment 646 #
Proposal for a regulation
Article 83 – paragraph 1 – point b
Article 83 – paragraph 1 – point b
(b) the number of removal orders issued to the provider in accordance with Article 14 and the average time needed for removing or disabling access to the item or items of child sexual abuse material in question, counting from the moment the order entered the provider’s system;
Amendment 648 #
Proposal for a regulation
Article 83 – paragraph 1 – point b a (new)
Article 83 – paragraph 1 – point b a (new)
(b a) the number and duration of delays to removals as a result of requests from competent authorities or law enforcement authorities;
Amendment 649 #
Proposal for a regulation
Article 83 – paragraph 1 – point c
Article 83 – paragraph 1 – point c
(c) the total number of items of child sexual abuse material that the provider removed or to which it disabled access, broken down by whether the items were removed or access thereto was disabled pursuant to a removal order or to a notice submitted by a judicial authority, a Competent Authority, the EU Centre, a national hotline, a trusted flagger or a private individual, or at the provider’s own initiative;
Amendment 650 #
Proposal for a regulation
Article 83 – paragraph 1 – point c a (new)
Article 83 – paragraph 1 – point c a (new)
(c a) the number of instances in which the provider was asked to provide additional support to law enforcement authorities in relation to content that was removed;
Amendment 651 #
Proposal for a regulation
Article 83 – paragraph 1 – point d
Article 83 – paragraph 1 – point d
Amendment 652 #
Proposal for a regulation
Article 83 – paragraph 2 – introductory part
Article 83 – paragraph 2 – introductory part
2. The Coordinating Authorities shall collect data on the following topics and make that information publicly available, redacting operationally sensitive data as appropriate and providing an unredacted version to the EU Centre upon request:
Amendment 653 #
Proposal for a regulation
Article 83 – paragraph 2 – point a – indent 4 a (new)
Article 83 – paragraph 2 – point a – indent 4 a (new)
- the nature of the report and its key characteristics, such as whether the security of the hosting service was allegedly breached;
Amendment 654 #
Proposal for a regulation
Article 83 – paragraph 2 – point b
Article 83 – paragraph 2 – point b
(b) the most important and recurrent risks of online child sexual abuse encountered, as reported by providers of hosting services and providers of publicly available number-independent interpersonal communications services in accordance with Article 3 or identified through other information available to the Coordinating Authority;
Amendment 655 #
Proposal for a regulation
Article 83 – paragraph 2 – point c
Article 83 – paragraph 2 – point c
(c) a list of the providers of hosting services and providers of interpersonal communications services to which the Coordinating Authority addressed an investigation order in accordance with Article 7;
Amendment 656 #
Proposal for a regulation
Article 83 – paragraph 2 – point d
Article 83 – paragraph 2 – point d
(d) the number of investigation orders issued in accordance with Article 7, broken down by provider and by type of online child sexual abuse, and the number of instances in which the provider invoked Article 8(3);
Amendment 657 #
Proposal for a regulation
Article 83 – paragraph 2 – point f
Article 83 – paragraph 2 – point f
(f) the number of removal orders issued in accordance with Article 14, broken down by provider, the time needed to remove or disable access to the item or items of child sexual abuse material concerned, including the time it took the Coordinating Authority to process the order, and the number of instances in which the provider invoked Article 14(5) and (6);
Amendment 658 #
Proposal for a regulation
Article 83 – paragraph 2 – point g
Article 83 – paragraph 2 – point g
Amendment 659 #
Proposal for a regulation
Article 83 – paragraph 3 – introductory part
Article 83 – paragraph 3 – introductory part
3. The EU Centre shall collect data and generate statistics on investigation orders, reporting, and removal of or disabling of access to online child sexual abuse under this Regulation. The data shall include:
Amendment 660 #
Proposal for a regulation
Article 83 – paragraph 3 – point a
Article 83 – paragraph 3 – point a
(a) the number of indicators in the databases of indicators referred to in Article 44 and the change of that number as compared to previous years;
Amendment 661 #
Proposal for a regulation
Article 83 – paragraph 3 – point b
Article 83 – paragraph 3 – point b
(b) the number of submissions of child sexual abuse material and solicitation of children referred to in Article 36(1), broken down by Member State that designated the submitting Coordinating Authorities, and, in the case of child sexual abuse material, the number of indicators generated on the basis thereof and the number of still active uniform resource locators included in the list of uniform resource locators in accordance with Article 44(3);
Amendment 662 #
Proposal for a regulation
Article 83 – paragraph 3 – point c
Article 83 – paragraph 3 – point c
(c) the total number of reports submitted to the EU Centre in accordance with Article 12, broken down by provider of hosting services and provider of publicly available number-independent interpersonal communications services that submitted the report and by Member State the competent authority of which the EU Centre forwarded the reports to in accordance with Article 48(3);
Amendment 663 #
Proposal for a regulation
Article 83 – paragraph 3 – point d
Article 83 – paragraph 3 – point d
(d) the online child sexual abuse to which the reports relate, including the number of items of potential known and new child sexual abuse material and instances of potential solicitation of children, the Member State the competent authority of which the EU Centre forwarded the reports to in accordance with Article 48(3), and the type of relevant information society service that the reporting provider offers;
Amendment 664 #
Proposal for a regulation
Article 83 – paragraph 3 – point e
Article 83 – paragraph 3 – point e
(e) the number of reports that the EU Centre considered unfounded or manifestly unfounded, as referred to in Article 48(2);
Amendment 665 #
Proposal for a regulation
Article 83 – paragraph 3 – point f
Article 83 – paragraph 3 – point f
(f) the number of reports relating to potential previously unknown child sexual abuse material and solicitation of children that were assessed as not constituting child sexual abuse material of which the EU Centre was informed pursuant to Article 36(3), broken down by Member State;
Amendment 666 #
Proposal for a regulation
Article 83 – paragraph 3 – point h
Article 83 – paragraph 3 – point h
(h) where materially the same item of potential child sexual abuse material was reported more than once to the EU Centre in accordance with Article 12 or detected more than once through the searches in accordance with Article 49(1), the number of times that that item was reported or detected in that manner.
Amendment 667 #
Proposal for a regulation
Article 83 – paragraph 4
Article 83 – paragraph 4
4. The providers of hosting services, providers of interpersonal communications services and providers of internet access services, the Coordinating Authorities and the EU Centre shall ensure that the data stored pursuant to paragraphs 1, 2 and 3, respectively, is stored no longer than is necessary for the transparency reporting referred to in Article 84. The data stored shall not contain any personal data.
Amendment 668 #
Proposal for a regulation
Article 83 – paragraph 5
Article 83 – paragraph 5
5. They shall ensure that the data is stored in a secure manner and that the storage is subject to appropriate technical and organisational safeguards. Those safeguards shall ensure, in particular, that the data can be accessed and processed only for the purpose for which it is stored, that a high level of security is achieved and that the information is deleted when no longer necessary for that purpose. All access to this data shall be logged and the logs securely stored for five years. They shall regularly review those safeguards and adjust them where necessary.
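By way of illustration only, the access-logging and retention requirements in this paragraph (every access logged, logs securely stored for five years, data deleted when no longer necessary) can be represented as follows. This is a minimal sketch under assumed, hypothetical names; the Regulation does not prescribe any particular technical implementation.

```python
# Minimal, illustrative sketch of the safeguards in paragraph 5:
# every access to the stored statistical data is logged, and log
# entries are retained for five years. All names are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

LOG_RETENTION = timedelta(days=5 * 365)  # "logs securely stored for five years"

@dataclass
class AccessLogEntry:
    timestamp: datetime
    user_id: str     # who accessed the stored data
    purpose: str     # must match the purpose for which the data is stored
    record_id: str   # which stored item was accessed

access_log: list[AccessLogEntry] = []

def log_access(user_id: str, purpose: str, record_id: str) -> None:
    """Record every access to the stored data, as paragraph 5 requires."""
    access_log.append(
        AccessLogEntry(datetime.now(timezone.utc), user_id, purpose, record_id)
    )

def purge_expired_log_entries(now: datetime) -> None:
    """Drop log entries older than the five-year retention period."""
    cutoff = now - LOG_RETENTION
    access_log[:] = [entry for entry in access_log if entry.timestamp >= cutoff]
```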