138 Amendments of Pilar DEL CASTILLO VERA related to 2020/0361(COD)
Amendment 81 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. The Commission and the Member States should provide guidance on how to identify the illegal content.
Amendment 94 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.
Amendment 105 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to assess the grounds for and, when necessary, proceed to removing or disabling access to all copies of that content, and, in accordance with the jurisprudence of the Court of Justice of the European Union, ensure that identical or equivalent illegal content does not reappear within the same context. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 112 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage automated or non-automated activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner for the sole purpose of detecting, identifying and acting against illegal content. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union or national law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability set out in this Regulation. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 121 #
Proposal for a regulation
Recital 30
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The orders to act against illegal content may require providers of intermediary services to take steps, in the specific case, to remove identical or equivalent illegal content, within the same context. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information.
Amendment 139 #
Proposal for a regulation
Recital 42 a (new)
(42 a) Hosting services providers should not be subject to the obligation to provide a statement of reasons when doing so would infringe on a legal right or cause unintended safety concerns for the recipient of the service. Specifically in cases of one-to-one interface platforms, such as dating applications and other similar services, providing the statement of reasons should be considered such as to likely cause unintended safety concerns for the reporting party. As a result of this, dating applications and other similar services should by default refrain from providing statements of reasons. Additionally, other providers of hosting services should make reasonable efforts to assess if providing a statement of reasons could cause unintended safety concerns to the reporting party, and in such cases, refrain from providing a statement of reasons.
Amendment 142 #
Proposal for a regulation
Recital 43 a (new)
(43 a) To similarly avoid unnecessary regulatory burden, certain obligations should not apply to online platforms offering products and services from third-party traders, which are established in the European Union, where these traders' access is exclusive, curated and entirely controlled by the providers of the online platform and these traders’ products and services are reviewed and pre-approved by the providers of the online platform before they are offered on the platform. These online platforms are often referred to as closed online platforms. As the products and services offered are reviewed and pre-approved by the online platforms, the prevalence of illegal content and products on these platforms is low, and these platforms cannot benefit from relevant liability exemptions outlined in this Regulation. These online platforms should subsequently not be subjected to the obligations which are necessary for platforms with different operational models where the prevalence of illegal content is more frequent and the relevant liability exemptions are available.
Amendment 151 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms, having received guidance from public authorities on how to identify illegal content, take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
Amendment 157 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders on the platforms should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
Amendment 161 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot, as this would be disproportionate. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties or be liable for this information in case it proves to be inaccurate. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48.
_________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
Amendment 175 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. In certain cases, online platforms whose number of recipients does not exceed the operational threshold set at 10% of the Union population should also be considered very large online platforms due to their role in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online.
Amendment 185 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation, without prejudice to its freedom to conduct a business and, in particular, its ability to design and implement effective measures that are aligned with its specific business model. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. A disclaimer of an opinion should be given where the auditor does not have enough information to conclude on an opinion due to the novelty of the issues audited.
Amendment 186 #
Proposal for a regulation
Recital 2 a (new)
(2a) Moreover, complex national regulatory requirements, fragmented implementation and insufficient enforcement of legislation such as Directive 2000/31/EC have contributed to high administrative costs and legal uncertainty for intermediary services operating on the internal market, especially for micro, small and medium-sized companies.
Amendment 213 #
Proposal for a regulation
Recital 9
(9) This Regulation fully harmonises the rules applicable to intermediary services in the internal market with the objective to ensure a safe and trusted online environment, effective protection of fundamental rights and a favourable business climate. Accordingly, Member States should not adopt or maintain additional national requirements on those matters falling within the scope of this Regulation. This does not preclude the possibility to apply other national legislation applicable to providers of intermediary services, in accordance with Union law, including Directive 2000/31/EC, in particular its Article 3, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. __________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
Amendment 225 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3 a (new)
- Providers of not-for-profit scientific or educational repositories are not considered intermediary services within the meaning of this Regulation.
Amendment 226 #
Proposal for a regulation
Recital 12
(12) For the purpose of this Regulation the concept of “illegal content” should be understood to refer to information, irrespective of its form, that is not in compliance with Union law as it refers to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 249 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. Services, such as internet infrastructure services or cloud service providers, which are provided at the request of parties other than the content providers and only indirectly benefitting the latter, should not be covered by the definition of online platforms. __________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
Amendment 257 #
Proposal for a regulation
Recital 17
(17) The relevant rules of Chapter II should only establish when the provider of intermediary services concerned cannot be held liable in relation to illegal content provided by the recipients of the service. Those rules should by no means be understood to provide a positive basis for establishing when a provider can be held liable, which is for the applicable rules of Union or national law to determine. Furthermore, the exemptions from liability established in this Regulation should apply in respect of any type of liability as regards any type of illegal content, irrespective of the precise subject matter or nature of those laws.
Amendment 266 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they take the necessary voluntary own-initiative investigation measures for the sole purpose of detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation.
Amendment 281 #
Proposal for a regulation
Recital 22 a (new)
(22a) The exemption of liability should not apply where the recipient of the service is acting under the authority or the control of the provider of a hosting service. In particular, where the provider of the online platform that allows consumers to conclude distance contracts with traders does not allow traders to determine the basic elements of the trader-consumer contract, such as the terms and conditions governing such relationship or the price, it should be considered that the trader acts under the authority or control of that platform.
Amendment 282 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders as a functionality of their service, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. This is the case where the online platform operator fails to clearly display the identity of the trader following this Regulation. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer. In particular, it is relevant whether the online platform operator withholds such identity or contract details until after the conclusion of the trader-consumer contract, or is marketing the product or service in its own name rather than using the name of the trader who will supply it.
Amendment 283 #
5 a. Providers of intermediary services that qualify as micro, small or medium-sized enterprise (SME) within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including possibilities for collective representation.
Amendment 310 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, cloud infrastructure services or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service.
Amendment 315 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprise (SME) within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraph 1 shall not apply to enterprises that previously qualified for the status of a micro, small or medium-sized enterprise (SME) within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
Amendment 319 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices at scale and exclusively by electronic means.
Amendment 329 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear identification of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
Amendment 331 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. Since intermediaries should not be required to remove information which is legal in their country of establishment, national and Union authorities should be able to order the blocking of content legally published outside the Union only for the territory of the Union where Union law is infringed and for the territory of the issuing Member State where national law is infringed.
Amendment 344 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. Where a provider of hosting services processes a notice and decides to remove or disable access to specific items of information provided by the recipients of the service, it shall, in the specific case, take steps to remove identical or equivalent illegal content within the same context.
Amendment 346 #
Proposal for a regulation
Article 14 – paragraph 6 b (new)
6 b. Paragraphs 2, 4 and 5 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraphs 2, 4 and 5 shall not apply to enterprises that previously qualified as a micro, small or medium-sized enterprise (SME) within the meaning of the Annex to Recommendation 2003/361/EC, during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
Amendment 348 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, without undue delay and at the latest within 24 hours after such removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.
Amendment 352 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
(c) where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means;
Amendment 358 #
Proposal for a regulation
Article 15 – paragraph 4
Amendment 359 #
Proposal for a regulation
Article 15 – paragraph 4 a (new)
Amendment 360 #
Proposal for a regulation
Article 15 – paragraph 4 b (new)
4 b. Providers of hosting services shall not be obliged to provide a statement of reasons referred to in paragraph 1 where doing so would infringe a legal obligation or where the statement of reasons could cause unintended safety concerns for the reporting party. In addition, providers of hosting services shall not be obliged to provide a statement of reasons referred to in paragraph 1 where the provider can demonstrate that the recipient of the service has repeatedly provided illegal content.
Amendment 362 #
Proposal for a regulation
Article 15 a (new)
Article 15 a
Protection against repeated misuse and criminal offences
1. Providers of intermediary services shall, after having issued a prior warning, suspend or, in appropriate circumstances, terminate the provision of their services to recipients of the service that frequently provide illegal content.
2. Where a provider of intermediary services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available. Where the provider of intermediary services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it has its main establishment or has its legal representative and also transmit this information to Europol for appropriate follow-up.
Amendment 362 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. Obligations related to terms and conditions should not oblige a provider of an intermediary service to disclose information that would lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets or intellectual property rights.
Amendment 373 #
Proposal for a regulation
Article 16 – paragraph 1 b (new)
This Section shall not apply to online platforms offering products and services from third-party traders established in the European Union, where those traders’ access is exclusive, curated and entirely controlled by the provider of the online platform and those traders’ products and services are reviewed and pre-approved by the provider of the online platform before they are offered on the platform.
Amendment 376 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible, comprehensive and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content following the applicable law ('action'). Such mechanisms should be clearly visible on the interface of the hosting service and easy to use. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. Providers of hosting services could, as a voluntary measure, conduct own-investigation measures to prevent content which has previously been identified as illegal from being disseminated again once removed. The obligations related to notice and action should by no means impose general monitoring obligations.
Amendment 388 #
Proposal for a regulation
Article 17 – paragraph 5
Amendment 391 #
Proposal for a regulation
Article 18 – paragraph 1
1. After internal complaint-handling mechanisms are exhausted, recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 401 #
Proposal for a regulation
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro, small or medium-sized enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations.
__________________
41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 403 #
Proposal for a regulation
Recital 43 a (new)
(43a) To similarly avoid unnecessary regulatory burdens, certain obligations should not apply to hosting service providers often referred to as closed online platforms where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
Amendment 405 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift, non-discriminatory and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and simple, affordable, expedient and accessible manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 417 #
Proposal for a regulation
Article 19 – paragraph 7 a (new)
7 a. Online platforms shall, where possible, provide trusted flaggers with access to technical means that help them detect illegal content on a large scale.
Amendment 417 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, depending on the severity of the illegal activity, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations and private or semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic content online. For intellectual property rights, organisations of industry and of individual right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43
__________________
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).
Amendment 422 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, or, in appropriate circumstances, terminate, the provision of their services to recipients of the service that frequently provide manifestly illegal content.
Amendment 427 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints- handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
Amendment 430 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year;
Amendment 434 #
Proposal for a regulation
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension, and the circumstances in which they will terminate their services.
Amendment 437 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
2. Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it has its main establishment or has its legal representative and also transmit this information to Europol for appropriate follow-up.
Amendment 442 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with professional traders on the platform, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained from the trader the following information:
Amendment 445 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers, the VAT Information Exchange System45 and the Union Rapid Alert System for dangerous non-food products (Rapex), or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48.
__________________
45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en
46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council.
47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’).
48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers.
Amendment 448 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
Amendment 450 #
Proposal for a regulation
Article 22 – paragraph 1 – point d
(d) to the extent the contract relates to products that are subject to the Union Regulations listed in Article 4(5) of Regulation (EU) 2019/1020 of the European Parliament and of the Council51, the name, address, telephone number and electronic mail address of the economic operator, established in the Union, referred to in Article 4(1) of Regulation (EU) 2019/1020, or any relevant act of Union law;
_________________
51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
Amendment 455 #
Proposal for a regulation
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, take effective steps that would reasonably be taken by a diligent operator in accordance with a high industry standard of professional diligence to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is accurate, current and reliable, through the use of independent and reliable sources, including any freely accessible official online database or online interface made available by a Member State or the Union, or through requests to the trader to provide supporting documents from reliable sources. The provider of intermediary services should require that traders promptly inform them of any changes to the information referred to in points (a), (d) and (e) and regularly repeat this verification process.
Amendment 455 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. Online advertising is a significant source of financing for many digital business models and an effective tool to reach new customers, not least for small and medium-sized companies. However, there are some instances when online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. To ensure consumer protection, online advertisement should be subject to proportionate and meaningful transparency obligations. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising.
Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 462 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
3. Where the online platform obtains indications, through the effective steps taken under paragraph 2 or through Member States’ consumer authorities, that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate, out of date or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
Amendment 478 #
Proposal for a regulation
Article 23 – paragraph 1 – point b
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
Amendment 490 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to, the measures taken by the very large online platform to comply with their obligations under this Regulation, without prejudice to its freedom to conduct a business and, in particular, its ability to design and implement effective measures that are aligned with its specific business model. The report should be transmitted to the Digital Services Coordinator of establishment and the Board within 30 days following its adoption, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.
Amendment 500 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms could pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
Amendment 506 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3, or where the operating model and nature of the platform is considered to constitute a systemic risk assessed in accordance with that methodology. This Section shall not apply to online platforms that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC. In addition, this Section shall not apply to enterprises that previously qualified as a micro, small or medium-sized enterprise within the meaning of the Annex to Recommendation 2003/361/EC, during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
Amendment 510 #
Proposal for a regulation
Article 25 – paragraph 3
3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active recipients of the service in the Union, or for assessing whether the operating model and nature of the platform constitute a systemic risk, for the purposes of paragraph 1. The methodology shall specify, in particular, how to determine the Union’s population and the criteria to determine the average monthly active recipients of the service in the Union, taking into account different accessibility features, as well as how to determine whether the operating model and size of a platform are such as to constitute a systemic risk.
Amendment 511 #
Proposal for a regulation
Article 25 – paragraph 3 a (new)
3 a. The delegated acts referred to in paragraph 3 shall base the methodology on the following criteria:
(a) the role of the online platform in facilitating public debate;
(b) the role, nature and volume of economic transactions on the online platform;
(c) the role of the online platform in disseminating information, opinions and ideas and in influencing how recipients of the service obtain and communicate information online; and
(d) the depth and scope of the societal risks posed by the platform, as well as the historical prevalence of illegal content on the service.
Online platforms, regardless of the number of average monthly active recipients of their service in the Union, that pose a high systemic risk based on an assessment following the criteria outlined in this paragraph shall be considered to be very large online platforms.
Amendment 512 #
Proposal for a regulation
Article 25 – paragraph 4 – subparagraph 1
4. The Digital Services Coordinator of establishment shall verify, at least every six months, whether the number of average monthly active recipients of the service in the Union of online platforms under their jurisdiction is equal to or higher than the number referred to in paragraph 1, or whether the operating model and nature of the platform constitute a systemic risk. On the basis of that verification, it shall adopt a decision designating the online platform as a very large online platform for the purposes of this Regulation, or terminating that designation, and communicate that decision, without undue delay, to the online platform concerned and to the Commission.
Amendment 520 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of intentionally inaccurate or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for certain groups of recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
Amendment 582 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a) are independent from the very large online platform concerned and have not provided any other service to the platform in the previous 12 months;
Amendment 584 #
Proposal for a regulation
Article 28 – paragraph 2 – point c a (new)
(c a) have not audited the same very large online platform for more than 3 consecutive years.
Amendment 587 #
Proposal for a regulation
Article 28 – paragraph 3 – point f
(f) where the audit opinion is negative, recommendations on specific measures to achieve compliance and risk-based remediation timelines, with a focus on rectifying, as a priority, issues that have the potential to cause most harm to users of the service.
Amendment 588 #
Proposal for a regulation
Article 28 – paragraph 3 – point f a (new)
Article 28 – paragraph 3 – point f a (new)
(f a) where the organisations that perform the audits do not have sufficient information to reach an opinion due to the novelty of the issues audited, a relevant disclaimer.
Amendment 591 #
Proposal for a regulation
Article 28 – paragraph 4 a (new)
Article 28 – paragraph 4 a (new)
4 a. Digital Services Coordinators shall provide very large online platforms under their jurisdiction with an annual audit plan outlining the key areas of focus for the upcoming audit cycle.
Amendment 613 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
Article 1 – paragraph 2 – point b
(b) set out harmonised rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
Amendment 614 #
Proposal for a regulation
Article 1 – paragraph 2 – point b – point i (new)
Article 1 – paragraph 2 – point b – point i (new)
i) facilitate innovations, support digital transition, encourage economic growth and create a level playing field for digital services within the internal market while strengthening consumer protection and contributing to increased consumer choice.
Amendment 616 #
Proposal for a regulation
Article 30 – paragraph 1
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, for advertisements that have been seen by more than 5 000 recipients of the service and until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
Amendment 620 #
Proposal for a regulation
Article 30 – paragraph 2 – point e
Article 30 – paragraph 2 – point e
(e) the total number of recipients of the service reached and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically.
Amendment 686 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
Amendment 697 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
Article 2 – paragraph 1 – point h
Amendment 728 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
Article 2 – paragraph 1 – point p
(p) ‘content moderation’ means the activities undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, demonetisation, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
Amendment 787 #
Proposal for a regulation
Article 6 – paragraph 1
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they take the necessary voluntary own-initiative investigations or other measures aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation, without prejudice to freedom of expression.
Amendment 790 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
Article 6 – paragraph 1 a (new)
Providers of intermediary services shall ensure that such measures are accompanied with appropriate safeguards, such as oversight, documentation and traceability or additional measures to ensure that own- initiative investigations are accurate, legally justified and do not lead to over- removal of content.
Amendment 897 #
Proposal for a regulation
Article 10 – title
Article 10 – title
Points of contact for authorities, the Commission and the Board
Amendment 908 #
Proposal for a regulation
Article 10 a (new)
Article 10 a (new)
Article 10a
Point of contact for recipients of a service
1. Providers of intermediary services shall establish a single point of contact allowing for direct communication, by electronic means, with the recipients of their services. The means of communication shall be user-friendly and easily accessible.
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact for recipients.
Amendment 918 #
Proposal for a regulation
Article 11 – paragraph 4 a (new)
Article 11 – paragraph 4 a (new)
4a. Providers of intermediary services that would qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC if established in the Union, and who have been unsuccessful in designating a legal representative after reasonable efforts, shall be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including the possibility for collective representation.
Amendment 925 #
Proposal for a regulation
Article 12 – paragraph 1
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including information about algorithmic decision-making and human review. Providers of intermediary services shall also include information on the right to terminate the use of the service. The possibility to terminate must be easily accessible for the user. Information on remedies and redress mechanisms shall also be included in the terms and conditions. The terms and conditions shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
Amendment 950 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
Article 12 – paragraph 2 a (new)
2a. Obligations pursuant to paragraphs 1 and 2 shall not oblige a provider of an intermediary service to disclose information that would lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets or intellectual property rights.
Amendment 989 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
Article 13 – paragraph 1 – point c
(c) meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures;
Amendment 1009 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
Article 13 – paragraph 2 a (new)
2a. Paragraph 1 shall not apply where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
Amendment 1060 #
Proposal for a regulation
Article 14 – paragraph 3
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 on the basis of which a diligent provider of hosting services is able to assess the illegality of the content in question, shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
Amendment 1064 #
Proposal for a regulation
Article 14 – paragraph 4
Article 14 – paragraph 4
4. Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity.
Amendment 1081 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
Article 14 – paragraph 6 a (new)
6a. Providers of hosting services could, as a voluntary measure in line with the provisions of Article 6, conduct own-investigation measures to prevent illegal content which has previously been identified as illegal from being disseminated again once removed. The obligations related to paragraphs 1 to 6 shall by no means impose general monitoring obligations on hosting services.
Amendment 1089 #
Proposal for a regulation
Article 14 – paragraph 6 c (new)
Article 14 – paragraph 6 c (new)
6c. Paragraphs 2, 4 and 5 shall not apply where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
Amendment 1096 #
Proposal for a regulation
Article 15 – paragraph 1
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove or disable access to or radically restrict the visibility of specific items of information provided by the recipients of the service, or to suspend or terminate monetary payments related to those items, irrespective of the means used for detecting, identifying or removing or disabling access to or for restricting the visibility or monetisation of that information and of the reason for its decision, it shall inform the recipient, without undue delay and at the latest within 24 hours after the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.
Amendment 1102 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
Article 15 – paragraph 2 – point a
(a) whether the decision entails either the removal of, the disabling of access to, or radical restriction of the visibility of, the information, or the suspension or termination of monetary payments related to that information and, where relevant, the territorial scope of the disabling of access;
Amendment 1124 #
Proposal for a regulation
Article 15 – paragraph 4 b (new)
Article 15 – paragraph 4 b (new)
4b. Paragraphs 2 to 4 shall not apply where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
Amendment 1129 #
Proposal for a regulation
Article 15 a (new)
Article 15 a (new)
Amendment 1157 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
Article 17 – paragraph 1 – point a
(a) decisions to remove or not to remove or disable access to the information;
Amendment 1158 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
Article 17 – paragraph 1 – point b
(b) decisions to suspend or terminate or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
Amendment 1161 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
Article 17 – paragraph 1 – point c
(c) decisions to suspend or terminate or not to suspend or terminate the recipients’ account.
Amendment 1166 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
Article 17 – paragraph 1 – point c a (new)
(ca) decisions to radically restrict the visibility of content provided by the recipients,
Amendment 1171 #
Proposal for a regulation
Article 17 – paragraph 1 – point c b (new)
Article 17 – paragraph 1 – point c b (new)
(cb) decisions to restrict the ability to monetise content provided by the recipients,
Amendment 1203 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1) and individuals or entities that have submitted notices, shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 1213 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a
Article 18 – paragraph 2 – subparagraph 1 – point a
(a) it is independent, including financially independent, and impartial of online platforms and recipients of the service provided by the online platforms and of individuals or entities that have submitted notices;
Amendment 1221 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
Article 18 – paragraph 2 – subparagraph 1 – point c
(c) the dispute settlement is easily accessible through electronic communication technology and provides for the possibility to submit a complaint and the requisite supporting documents online;
Amendment 1236 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e
Article 18 – paragraph 2 – subparagraph 1 – point e
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure that are clearly visible and easily accessible to all parties concerned and in full compliance with all applicable law.
Amendment 1242 #
Proposal for a regulation
Article 18 – paragraph 2 a (new)
Article 18 – paragraph 2 a (new)
2a. The Digital Services Coordinator shall reassess on a yearly basis whether the certified out-of-court dispute settlement body continues to fulfil the listed criteria. If this is not the case, the Digital Services Coordinator shall revoke the status from the out-of-court dispute settlement body.
Amendment 1251 #
Proposal for a regulation
Article 18 – paragraph 5
Article 18 – paragraph 5
5. Digital Services Coordinators shall notify to the Commission the out-of-court dispute settlement bodies that they have certified in accordance with paragraph 2, including where applicable the specifications referred to in the second subparagraph of that paragraph as well as out-of-court dispute settlement bodies whose status has been revoked. The Commission shall publish a list of those bodies, including those specifications, on a dedicated website, and keep it updated.
Amendment 1278 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform;
Amendment 1296 #
Proposal for a regulation
Article 19 – paragraph 3
Article 19 – paragraph 3
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of trusted flagger in accordance with paragraph 2 or whose status has been revoked in accordance with paragraph 6.
Amendment 1308 #
Proposal for a regulation
Article 19 – paragraph 6
Article 19 – paragraph 6
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, carried out without undue delay, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
Amendment 1339 #
Proposal for a regulation
Article 20 – paragraph 3 – point d
Article 20 – paragraph 3 – point d
(d) where identifiable, the intention of the recipient, individual, entity or complainant.
Amendment 1349 #
Proposal for a regulation
Article 20 – paragraph 4 a (new)
Article 20 – paragraph 4 a (new)
4a. Providers of hosting services could, as a voluntary measure in line with the provisions of Article 6, conduct own-investigation measures to prevent suspended accounts from reappearing before the suspension is lifted. The obligations related to paragraphs 1 to 4 shall by no means impose general monitoring obligations on hosting services.
Amendment 1391 #
Proposal for a regulation
Article 22 – paragraph 1 – point f
Article 22 – paragraph 1 – point f
(f) a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law and where applicable confirming that all products have been checked against the Union Rapid Alert System for dangerous non-food products (Rapex).
Amendment 1404 #
Proposal for a regulation
Article 22 – paragraph 2
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d), (e) and (f) of paragraph 1 is reliable through the use of any freely accessible official online database, like the Rapex system, or online interfaces made available by a Member State or the Union, or through requests to the trader to provide supporting documents from reliable sources. The online platform shall require that traders promptly inform them of any changes to the information referred to in points (a), (d), (e) and (f) and regularly repeat this verification process.
Amendment 1459 #
Proposal for a regulation
Article 22 a (new)
Article 22 a (new)
Article 22a
Obligation to inform consumers and authorities about illegal products and services
1. Where an online platform allows consumers to conclude distance contracts with traders, it shall be subject to additional information obligations for consumers. Where the online platform becomes aware of the illegal nature of a product or service offered by a trader on its interface it shall:
(a) immediately remove the illegal product from its interface and inform relevant authorities about it;
(b) maintain an internal database of content removed and/or recipients suspended pursuant to Article 20 to be used by internal content moderation systems tackling the identified risks;
(c) where the online platform has the contact details of the recipients of its services, inform such recipients of the service that have purchased said product or service during the past twelve months about the illegality, the identity of the trader and options for seeking redress;
(d) compile and make publicly available through application programming interfaces a repository containing information about illegal products and services removed from its platform in the past six months along with information about the concerned trader and options for seeking redress.
Amendment 1476 #
Proposal for a regulation
Article 23 – paragraph 2 a (new)
Article 23 – paragraph 2 a (new)
2a. Member States shall refrain from imposing additional transparency reporting obligations on the online platforms, other than specific requests in the context of exercising their supervisory powers.
Amendment 1510 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Article 24 – paragraph 1 a (new)
1a. Online platforms shall provide the information mentioned in paragraph 1 to public authorities, upon their request, in order to determine accountability in case of false or misleading advertisements.
Amendment 1534 #
Proposal for a regulation
Article 25 – paragraph 1 a (new)
Article 25 – paragraph 1 a (new)
1a. This section shall not apply where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
Amendment 1552 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the dissemination of illegal content on their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 1567 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively through dissemination of illegal content;
Amendment 1577 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative and illegal effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
Amendment 1596 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
Article 26 – paragraph 2 a (new)
2a. The obligations detailed in paragraphs 1 and 2 shall by no means lead to a general monitoring obligation.
Amendment 1604 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures targeting illegal practices, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
Amendment 1661 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
Article 28 – paragraph 1 – point b
(b) any voluntary commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37.
Amendment 1702 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
Article 29 – paragraph 2 a (new)
2a. Obligations pursuant to paragraphs 1 and 2 shall not oblige a very large online platform to disclose information that will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets and intellectual property rights. Further, very large online platforms shall not be required to enable modification of systems essential to uphold the safety and security of the service.
Amendment 1719 #
Proposal for a regulation
Article 30 – paragraph 1
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, until six months after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
Amendment 1798 #
Proposal for a regulation
Article 33 – paragraph 1
Article 33 – paragraph 1
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every twelve months.
Amendment 1808 #
Proposal for a regulation
Article 33 a (new)
Article 33 a (new)
Article 33a
Algorithm transparency
1. When using automated decision making, the very large online platform shall upon request provide the Commission with the necessary information to assess the algorithms used.
2. When carrying out the assessments referred to in paragraph 1, the Commission shall consider the following elements:
(a) the compliance with corresponding Union requirements;
(b) potential negative effects on fundamental rights, including on consumer rights, through dissemination of illegal content.
3. Following an assessment, the Commission shall communicate its findings to the very large online platform and allow it to provide additional explanation.
4. Where the Commission finds that the algorithm used by the very large online platform does not comply with point (a) or (b) of paragraph 2 of this Article, the Commission shall inform the Digital Services Coordinator of establishment of the very large online platform.
Amendment 1846 #
Proposal for a regulation
Article 35 – paragraph 1
Article 35 – paragraph 1
1. The Commission and the Board shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. The Commission shall also encourage and facilitate regular review and adaption of the Codes of conduct to ensure that they are fit for purpose.
Amendment 1853 #
Proposal for a regulation
Article 35 – paragraph 2
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 1864 #
Proposal for a regulation
Article 35 – paragraph 3
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. Key performance indicators and reporting commitments should take into account differences in size and capacity between different participants.
Amendment 1883 #
Proposal for a regulation
Article 36 – paragraph 1
Article 36 – paragraph 1
1. The Commission shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services, or organisations representing recipients of the service and civil society organisations or relevant authorities, to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30.
Amendment 1897 #
Proposal for a regulation
Article 37 – paragraph 1
Article 37 – paragraph 1
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health.
Amendment 1945 #
Proposal for a regulation
Article 41 – paragraph 2 – subparagraph 1 – point e
Article 41 – paragraph 2 – subparagraph 1 – point e
(e) the power to adopt proportionate interim measures to avoid the risk of serious harm, without prejudice to fundamental rights.
Amendment 2088 #
Proposal for a regulation
Article 49 – paragraph 1 – point d a (new)
Article 49 – paragraph 1 – point d a (new)
(da) monitor derogations from the internal market clause in accordance with Article 3 of Directive 2000/31/EC and ensure that the conditions for derogation are interpreted strictly and narrowly to ensure consistent application of this Regulation;
Amendment 2089 #
Proposal for a regulation
Article 49 – paragraph 1 – point e
Article 49 – paragraph 1 – point e
(e) support and promote the development and implementation of European standards, guidelines, reports, templates and codes of conduct in close collaboration with relevant stakeholders as provided for in this Regulation, as well as the identification of emerging issues, with regard to matters covered by this Regulation.
Amendment 2164 #
Proposal for a regulation
Article 55 – paragraph 1
Article 55 – paragraph 1
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order proportionate interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement, without prejudice to fundamental rights.
Amendment 2182 #
Proposal for a regulation
Article 57 – paragraph 1
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation of, and compliance with, this Regulation by the very large online platform concerned. The Commission may also order that platform to provide explanations relating to, and where necessary access to, its databases and algorithms.
Amendment 2212 #
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
Article 59 – paragraph 2 – introductory part
2. The Commission may by decision and in compliance with the proportionality principle impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total turnover in the preceding financial year, where they intentionally or as a result of repeated negligence:
Amendment 2296 #
Proposal for a regulation
Article 74 – paragraph 2
Article 74 – paragraph 2
2. It shall apply from [date - twelve months after its entry into force].