98 Amendments of Lena DÜPONT related to 2020/0361(COD)
Amendment 28 #
Proposal for a regulation
Recital 2
(2) Up till now, politics has relied on voluntary cooperation with a view to addressing these risks and challenges. Since this has proved insufficient and there has been a lack of harmonised rules at Union level, Member States have been increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice.
Amendment 30 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, the gender equality principle and non-discrimination. In order to exercise these rights, the online world needs to be a safe space, especially for women and girls, where everybody can move freely. Therefore, measures to protect from, and prevent, phenomena such as online violence, cyberstalking, harassment, hate speech and exploitation of women and girls are essential.
Amendment 34 #
Proposal for a regulation
Recital 5
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26, that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities. Given that online platforms are part of our everyday life and have become indispensable, even more so since the pandemic, the spread of illegal and harmful content, such as child sexual abuse material, online sexual harassment, unlawful non-consensual sharing of private images and videos, and cyber violence, has risen dramatically as well. Ensuring a safe space online implies targeted actions against all phenomena harmfully affecting our social life, including through an awaited proposal on how to deal with harmful but not illegal content online. _________________ 26Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1).
Amendment 35 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 Regulation (EU) 2021/784 of the European Parliament and of the Council29 and the recently adopted Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. _________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (OJ L 172, 17.5.2021, p. 79).
Amendment 37 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly in order to underpin the general idea that what is illegal offline should also be illegal online. The concept should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech, child sexual abuse material or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as trafficking in human beings, sexual exploitation of women and girls, the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images and videos, online stalking, grooming adolescents, online sexual harassment and other forms of gender-based violence, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 47 #
Proposal for a regulation
Recital 26 a (new)
(26 a) Although providers of intermediary services have already applied risk assessments, there is still potential for improvement in the security and safety of all users, especially children, women and other vulnerable groups. Therefore providers of intermediary services, more precisely online platforms and very large online platforms, shall regularly evaluate their risk assessment and, if found necessary, improve it. Given the importance of providers of intermediary services and their potential to impact social life, common rules determining how users shall behave online should be applied. The implementation of a code of conduct should be obligatory for every provider of intermediary services covered by this Regulation.
Amendment 48 #
Proposal for a regulation
Recital 30
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679, the recently adopted Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. Member States should ensure that the competent authorities fulfil their tasks in an objective, independent and non-discriminatory manner. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) 2021/784 addressing the dissemination of terrorist content online, the recently adopted Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online, or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information.
Amendment 52 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors, women and girls, as well as vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 53 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. Providers offering their services in more than one Member State should provide a breakdown of the information by Member State. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 Aligned with the annual reports broken down by actions of content moderation and by Member State, the results for all forms of violence against women and girls online, hate speech and other illegal content should also appear in crime statistics. All forms of violence against women and girls shall be reported as a separate category in those crime statistics and law enforcement entities shall list them separately. _________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 57 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the gender equality principle and the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
Amendment 63 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising reproducing stereotypical content with an impact on the equal treatment and opportunities of citizens, contrary to the gender equality principle. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 66 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material, unlawful non-consensual sharing of private images and videos, online stalking or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the gender equality principle, the right to non-discrimination and the rights of the child. The social dimension is also affected by phenomena such as online harassment and cyber violence, as online platforms play a major role in our everyday life. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform, including when algorithms are misinformed, causing gender gaps to widen, or the misuse of their service through the submission of abusive notices or other methods for silencing speech, causing harm, such as long-term mental health damage, psychological damage and societal damage, or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 77 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should be obliged to regularly review their algorithms to minimise such negative consequences and should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients have alternative options for the main parameters, including a visible, user-friendly and readily available option to turn off algorithmic selection within the recommender system entirely and options that are not based on profiling of the recipient. Gender-based algorithmic bias must be prevented to avoid a discriminatory impact on women and girls.
Amendment 84 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as gender equality, including equality between women and men, and non-discrimination, eradicating all forms of violence against women and girls, including online violence, harassment and sexual exploitation, online stalking, child abuse, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
Amendment 89 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter, including equality, are effectively protected.
Amendment 90 #
Proposal for a regulation
Article 1 – paragraph 5 – point d
(d) Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online;
Amendment 91 #
Proposal for a regulation
Article 1 – paragraph 5 – point d a (new)
(d a) Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online;
Amendment 98 #
Proposal for a regulation
Article 12 – title
Terms and conditions, code of conduct
Amendment 101 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, non-discriminatory, transparent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 103 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. Providers of intermediary services shall be obliged to include on their platforms a code of conduct, setting out behavioural rules for their users. These rules shall be publicly available in an easily accessible format and shall be set out in clear and unambiguous language.
Amendment 107 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include breakdowns at Member State level and, in particular, information on the following, as applicable:
Amendment 126 #
Proposal for a regulation
Recital 2
(2) For years, politics has relied on voluntary cooperation with a view to addressing these risks and challenges. Since this has proved insufficient and there has been a lack of harmonised rules at Union level, Member States have been increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice. Moreover, a fragmentation of rules can have negative consequences for the freedom of expression.
Amendment 132 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing users and consumers within the Union to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, the rights to privacy and data protection, and the right to non-discrimination.
Amendment 134 #
Proposal for a regulation
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States should be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member State as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, the mere technical accessibility of a website, of an email address or of other contact details from the Union, cannot, on that ground alone, be considered sufficient to constitute a substantial connection to the Union. _________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1)
Amendment 135 #
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, gender equality, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
Amendment 136 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) 2021/784 of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. _________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
Amendment 138 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content or of content that risks an increase in online violence, and of information that is incompatible with their terms and conditions.
Amendment 142 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech, child sexual abuse material or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 148 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. Accordingly, that information should be considered to be disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision or selection of whom to grant access. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, should fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. _________________ 39Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
Amendment 148 #
(b) targeted measures aimed at limiting the display of advertisements or harmful content in association with the service they provide;
Amendment 161 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against, removing and reporting illegal content that providers of intermediary services may undertake on a voluntary basis, including the deployment of automated tools to detect manifestly illegal content, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union or national law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 165 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services, cloud infrastructure providers and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service.
Amendment 166 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation in relation to illegal content. However, notice-and-action mechanisms should be complemented by requirements for providers to take specific measures that are proportionate to their scale of reach as well as their technical and operational capacities in order to effectively address the appearance of illegal content on their services.
Amendment 170 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial, law enforcement or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders.
Amendment 172 #
Proposal for a regulation
Recital 30
(30) Orders to act against illegal content or to provide information should be issued by designated national competent authorities in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. Member States should ensure that the competent authorities fulfil their tasks in an objective, independent and non-discriminatory manner and do not seek or take instructions from any other body in relation to the exercise of the tasks under this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online, or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information.
Amendment 173 #
Proposal for a regulation
Recital 30 a (new)
(30 a) In line with the judgment of the Court of Justice of 3 October 2019 in case C-18/18 and where technologically feasible, providers of intermediary services may be required, on the basis of sufficiently substantiated orders by designated competent authorities and taking full account of the specific context of the content, to execute periodic searches for distinct pieces of content that a court has already declared unlawful, provided that the monitoring of and search for the information concerned by such an injunction are limited to information conveying a message whose content remains essentially unchanged compared with the content which gave rise to the finding of illegality and containing the elements specified in the injunction, which are identical or equivalent to the extent that would not require the host provider to carry out an independent assessment of that content.
Amendment 174 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial, law enforcement or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. Providers of intermediary services should not be legally required to remove content which is legal in their country of establishment. However, in accordance with Union law, it should be possible for a competent authority to request a provider established or legally represented in another Member State to block access to specific content from the Union territory. This is without prejudice to the right for providers to check specific content subject to an order against their terms and conditions and subsequently remove it despite it being legal in their country of establishment.
Amendment 180 #
Proposal for a regulation
Recital 37
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union empowered to act on their behalf, so as to allow for the compliance, effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for intermediary services to designate, for the purposes of this Regulation, a legal representative already designated for other purposes, provided that that legal representative is able to fulfil the function provided for in this Regulation. Providers of intermediary services that are part of a group should be allowed to collectively designate one legal representative.
Amendment 181 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. Providers offering their services in more than one Member State should provide a breakdown of the information by Member State. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 _________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium- sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 184 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party assesses to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider, based on its own assessment, can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
Amendment 191 #
(42) Where a hosting service provider decides to remove or disable access to information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated tools, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 202 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of person, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44 or in Directive 2017/571 of the European Parliament and the Council. In such instances, the online platform should inform without delay the competent law enforcement authorities or Europol in cases where the competent law enforcement authority cannot easily be identified, of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
Amendment 218 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of the fundamental rights to freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on public health, education, civic discourse, electoral processes, public safety and security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 222 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary and proportionate means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as labelling of content shared by bots or removing fake accounts, improving the visibility of authoritative information sources and offering corrections whenever possible. Very large online platforms are asked to reinforce their internal processes or supervision of any of their activities, in particular as regards the detection and resolution of systemic risks. They are also asked to initiate or increase cooperation with trusted flaggers and independent fact-checkers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, with a view to limiting the negative effects on the fundamental rights of all parties involved, notably the recipients of the service.
Amendment 231 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. Where applicable, the report should include a description of specific elements that could not be audited, and an explanation of why these could not be audited. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. Where the audit opinion could not reach a conclusion for specific elements within the scope of the audit, a statement of reasons for the failure to reach such a conclusive opinion should be added.
Amendment 233 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients have alternative options for the main parameters, including a visible, user-friendly and readily available option to turn off algorithmic selection within the recommender system entirely and options that are not based on profiling of the recipient.
Amendment 240 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment, the Digital Services Coordinators of destination or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted scientific researchers meeting specific requirements, for the sole purpose of conducting research that contributes to the identification, understanding and mitigation of systemic risks. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate public and private interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. The vetted researchers should be required to make the results of their research publicly available, taking into account the rights and interests of the recipients of the service concerned, notably under Regulation (EU) 2016/679.
Amendment 242 #
Proposal for a regulation
Recital 64
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate, reported per individual Member State and appropriately protect the rights and legitimate public and private interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
Amendment 257 #
Proposal for a regulation
Recital 84
Recital 84
(84) The Digital Services Coordinator should regularly publish a report on the activities carried out under this Regulation. Given that the Digital Services Coordinator is also made aware of orders to take action against illegal content or to provide information regulated by this Regulation through the common information sharing system, the Digital Services Coordinator should include in its annual report the number and categories of these orders addressed to providers of intermediary services issued by judicial, law enforcement and administrative authorities in its Member State.
Amendment 272 #
Proposal for a regulation
Article 1 – paragraph 1 – point c
Article 1 – paragraph 1 – point c
(c) rules on the implementation and enforcement of the requirements set out in this Regulation, including as regards the cooperation of and coordination between the competent authorities.
Amendment 318 #
Proposal for a regulation
Article 6 – paragraph 1
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they apply voluntary measures on their own initiative or carry out other activities aimed at detecting, identifying, reporting and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements set out in national or Union law, including those set out in this Regulation.
Amendment 331 #
Proposal for a regulation
Article 8 – paragraph 1
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by a competent national judicial or administrative authority, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 337 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
Article 8 – paragraph 2 – point a – indent 1
— a sufficient statement of clear reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed;
Amendment 338 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
Article 8 – paragraph 2 – point a – indent 3
— user-friendly information about redress available to the provider of the service and to the recipient of the service who provided the content, including information about redress to the competent authority, recourse to a court, as well as the deadlines for appeal;
Amendment 342 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 a (new)
Article 8 – paragraph 2 – point a – indent 3 a (new)
- identification details of the competent authority issuing the order and authentication of the order by that competent authority;
Amendment 344 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 b (new)
Article 8 – paragraph 2 – point a – indent 3 b (new)
- a reference to the legal basis for the removal order;
Amendment 345 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 c (new)
Article 8 – paragraph 2 – point a – indent 3 c (new)
Amendment 346 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 d (new)
Article 8 – paragraph 2 – point a – indent 3 d (new)
- where necessary and proportionate, the decision not to disclose information about the removal of or disabling of access to the content for reasons of public security, such as the prevention, investigation, detection and prosecution of serious crime, for as long as necessary, but not exceeding six weeks from that decision;
Amendment 355 #
Proposal for a regulation
Article 8 – paragraph 3
Article 8 – paragraph 3
3. The Digital Services Coordinator from the Member State of the competent authority issuing the order shall, without undue delay, transmit a copy of the orders referred to in paragraph 1 to all other Digital Services Coordinators through the system established in accordance with Article 67.
Amendment 366 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
Article 9 – paragraph 2 – point a – indent 1
— a sufficient statement of clear reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to public security, such as the prevention, investigation, detection and prosecution of criminal offences;
Amendment 370 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2
Article 9 – paragraph 2 – point a – indent 2
— user-friendly information about redress available to the provider and to the recipients of the service concerned, including information about redress to the competent authority, recourse to a court, as well as the deadlines for appeal;
Amendment 373 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2 a (new)
Article 9 – paragraph 2 – point a – indent 2 a (new)
- identification details of the competent authority issuing the order and authentication of the order by that competent authority;
Amendment 374 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2 b (new)
Article 9 – paragraph 2 – point a – indent 2 b (new)
- a reference to the legal basis for the removal order;
Amendment 375 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2 c (new)
Article 9 – paragraph 2 – point a – indent 2 c (new)
- the date, time stamp and electronic signature of the competent authority issuing the removal order;
Amendment 403 #
Proposal for a regulation
Article 12 – paragraph 2
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective, proportionate, non-discriminatory and transparent manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 417 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include breakdowns at Member State level and, in particular, information on the following, as applicable:
Amendment 419 #
Proposal for a regulation
Article 13 – paragraph 1 – point a a (new)
Article 13 – paragraph 1 – point a a (new)
(a a) the total number of content moderators allocated for each official language per Member State and a qualitative description of how automated tools for content moderation are used in each official language;
Amendment 424 #
Proposal for a regulation
Article 13 – paragraph 1 – subparagraph 1 (new)
Article 13 – paragraph 1 – subparagraph 1 (new)
Providers of intermediary services shall provide an accessible database of removed harmful content. This database shall be user-friendly and it shall include the reasons why the content was removed.
Amendment 468 #
Proposal for a regulation
Article 15 – paragraph 1 a (new)
Article 15 – paragraph 1 a (new)
1 a. Providers of hosting services may choose to make use of ex ante control measures based on automated tools for content moderation, notably to prevent the upload of specific content which has been declared illegal by a court. Where providers of hosting services otherwise use automated tools for content moderation, they shall ensure that qualified staff decide on any action to be taken and that legal content which does not infringe the terms and conditions set out by the providers is not affected. The provider shall ensure that the staff is provided with adequate training on the applicable legislation and, where necessary, with access to professional support, qualified psychological assistance and qualified legal advice.
Amendment 469 #
Proposal for a regulation
Article 15 – paragraph 1 b (new)
Article 15 – paragraph 1 b (new)
1 b. Providers of hosting services shall act in a transparent, coherent, predictable, non-discriminatory, diligent and proportionate manner when moderating content, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 473 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
Article 15 – paragraph 2 – point c
(c) where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means;
Amendment 478 #
Proposal for a regulation
Article 15 – paragraph 4 a (new)
Article 15 – paragraph 4 a (new)
4 a. The obligations pursuant to this Article shall not apply where the provider can demonstrate that the recipient of the service has repeatedly provided illegal content or where the removal is based on an order in accordance with Article 8 and the competent authority that issued the order decides that it is necessary and proportionate that there be no disclosure for reasons of public security, such as the prevention, investigation, detection and prosecution of terrorist offences, for as long as necessary, but not exceeding six weeks from that decision. In such a case, the hosting service provider shall not disclose any information on the removal or disabling of access to terrorist content. That competent authority may extend that period by a further six weeks, where such non-disclosure continues to be justified.
Amendment 568 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
Article 20 – paragraph 3 – point a
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints submitted in the past year, broken down per Member State;
Amendment 571 #
Proposal for a regulation
Article 20 – paragraph 3 – point b
Article 20 – paragraph 3 – point b
(b) the relative proportion of manifestly unfounded notices or complaints in relation to the total number of items of information provided or notices submitted in the past year, broken down per Member State;
Amendment 573 #
Proposal for a regulation
Article 20 – paragraph 4
Article 20 – paragraph 4
4. Online platforms shall set out, in clear and user-friendly language, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
Amendment 619 #
1. Very large online platforms shall identify, analyse and assess, on an ongoing basis, the probability and severity of any systemic risks stemming from the design, functioning and use made of their services in the Union, including disproportionate systemic risks at Member State level. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 629 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
Article 26 – paragraph 1 – point c
(c) intentional or coordinated manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, education, minors, civic discourse, risk of deception or manipulation of users and consumers, or actual or foreseeable effects related to electoral processes and public safety and security.
Amendment 635 #
Proposal for a regulation
Article 27 – title
Article 27 – title
27 Specific measures to mitigate risks
Amendment 637 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
Article 27 – paragraph 1 – introductory part
1. Without prejudice to the due diligence requirements set out in Chapter III of this Regulation, very large online platforms shall put in place appropriate, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, as applicable:
Amendment 644 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems and online interfaces, their decision-making processes, the features or functioning of their services, or their terms and conditions;
Amendment 645 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems and online interfaces, their decision-making processes, the features or functioning of their services, or their terms and conditions;
Amendment 648 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
Article 27 – paragraph 1 – point b
(b) targeted measures aimed at discontinuing or at least limiting the display of advertisements in association with specific content;
Amendment 652 #
Proposal for a regulation
Article 27 – paragraph 1 – point c
Article 27 – paragraph 1 – point c
(c) reinforcing the internal processes or supervision of any of their activities, in particular as regards detection and resolution of systemic risk;
Amendment 657 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
Article 27 – paragraph 1 – point e
(e) initiating or adjusting cooperation with other online platforms through the codes of conduct and the crisis protocols referred to in Articles 35 and 37 respectively, as well as other relevant self-regulatory measures.
Amendment 667 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
Article 27 – paragraph 2 – point a
(a) identification and assessment of each of the systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
Amendment 673 #
Proposal for a regulation
Article 27 – paragraph 3
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, shall issue general guidelines on the application of paragraph 1 in relation to specific systemic risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, enshrined in the Charter, of all parties involved. When preparing those guidelines, the Commission shall organise public consultations.
Amendment 680 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:
Amendment 683 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
Amendment 700 #
Proposal for a regulation
Article 28 – paragraph 3 – point f a (new)
Article 28 – paragraph 3 – point f a (new)
(f a) where the audit opinion could not reach a conclusion for specific elements within the scope of the audit, a statement of reasons for the failure to reach such a conclusive opinion.
Amendment 703 #
Proposal for a regulation
Article 28 – paragraph 3 – point f b (new)
Article 28 – paragraph 3 – point f b (new)
(f b) a description of specific elements that could not be audited, and an explanation of why these could not be audited.
Amendment 711 #
Proposal for a regulation
Article 29 – paragraph 1
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, and they shall provide clear and user-friendly options for the recipients of the service to modify or influence those main parameters, including at least one option which is not based on profiling, within the meaning of Article 4(4) of Regulation (EU) 2016/679.
Amendment 718 #
Proposal for a regulation
Article 29 – paragraph 2
Article 29 – paragraph 2
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible and user-friendly functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them, including the option not to apply any recommender systems and to have the content shown in chronological order.
Amendment 740 #
Proposal for a regulation
Article 31 – paragraph 2
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification, understanding and mitigation of systemic risks as set out in Articles 26(1) and 27.
Amendment 747 #
Proposal for a regulation
Article 31 – paragraph 4
Article 31 – paragraph 4
4. In order to be vetted, scientific researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
Amendment 751 #
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
Article 31 – paragraph 6 – introductory part
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested because of one of the following three reasons:
Amendment 755 #
Proposal for a regulation
Article 31 – paragraph 6 – point b a (new)
Article 31 – paragraph 6 – point b a (new)
(b a) insofar as personal data is concerned, giving access to the data would violate applicable Union or national data protection law.
Amendment 757 #
Proposal for a regulation
Article 31 – paragraph 7 a (new)
Article 31 – paragraph 7 a (new)
7 a. Upon completion of the research envisaged in Article 31(2), the vetted researchers shall make their research publicly available, taking into account the rights and interests of the recipients of the service concerned, in compliance with Regulation (EU) 2016/679.
Amendment 779 #
Proposal for a regulation
Article 35 – paragraph 2
Article 35 – paragraph 2