126 Amendments of Maria da Graça CARVALHO related to 2020/0361(COD)
Amendment 141 #
Proposal for a regulation
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro, small and medium-sized enterprises (SMEs) as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro, small and medium-sized enterprises (SMEs) from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. _________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 142 #
Proposal for a regulation
Recital 43 a (new)
(43 a) To similarly avoid unnecessary regulatory burden, certain obligations should not apply to online platforms offering products and services from third-party traders established in the European Union, where these traders’ access is exclusive, curated and entirely controlled by the providers of the online platform and these traders’ products and services are reviewed and pre-approved by the providers of the online platform before they are offered on the platform. These online platforms are often referred to as closed online platforms. As the products and services offered are reviewed and pre-approved by the online platforms, the prevalence of illegal content and products on these platforms is low, and these platforms cannot benefit from the relevant liability exemptions outlined in this Regulation. These online platforms should consequently not be subject to the obligations which are necessary for platforms with different operational models, where the prevalence of illegal content is more frequent and the relevant liability exemptions are available.
Amendment 151 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms, having received guidance from public authorities on how to identify illegal content, take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).
Amendment 161 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System,45 or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot, as this would be disproportionate. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties, or be liable for this information in case it proves to be inaccurate. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council,46 Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council.48
_________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council. 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’). 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers.
Amendment 175 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. In certain cases, online platforms whose number of recipients does not exceed the operational threshold set at 10% of the Union population should also be considered very large online platforms due to their role in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online.
Amendment 183 #
Proposal for a regulation
Recital 1
(1) Information society services and especially intermediary services have become an important part of the Union’s economy and the daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council,25 new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks, challenges and opportunities, both for individual users and for companies and society as a whole. __________________ 25 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
Amendment 184 #
Proposal for a regulation
Recital 2
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market by undermining its integrity, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice.
Amendment 185 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation, without prejudice to its freedom to conduct a business and, in particular, its ability to design and implement effective measures that are aligned with its specific business model. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. A disclaimer of an opinion should be given where the auditor does not have enough information to conclude on an opinion due to the novelty of the issues audited.
Amendment 186 #
Proposal for a regulation
Recital 2 a (new)
(2a) Moreover, complex national regulatory requirements, fragmented implementation and insufficient enforcement of legislation such as Directive 2000/31/EC have contributed to high administrative costs and legal uncertainty for intermediary services operating in the internal market, especially for micro, small and medium-sized enterprises.
Amendment 190 #
Proposal for a regulation
Recital 4
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers, protecting consumers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated.
Amendment 213 #
Proposal for a regulation
Recital 9
(9) This Regulation fully harmonises the rules applicable to intermediary services in the internal market with the objective to ensure a safe and trusted online environment, effective protection of fundamental rights and a favourable business climate. Accordingly, Member States should not adopt or maintain additional national requirements on those matters falling within the scope of this Regulation. This does not preclude the possibility to apply other national legislation applicable to providers of intermediary services, in accordance with Union law, including Directive 2000/31/EC, in particular its Article 3, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. __________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance) (OJ L 95, 15.4.2010, p. 1). 29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation.
Amendment 225 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3 a (new)
- Providers of not-for-profit scientific or educational repositories shall not be considered intermediary services within the meaning of this Regulation.
Amendment 226 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be understood to refer to information, irrespective of its form, that is not in compliance with Union law as it refers to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 249 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. Services, such as internet infrastructure services or cloud service providers, which are provided at the request of parties other than the content providers and only indirectly benefit the latter, should not be covered by the definition of online platforms. __________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast) (OJ L 321, 17.12.2018, p. 36).
Amendment 278 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act without undue delay to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 281 #
Proposal for a regulation
Recital 22 a (new)
(22a) The exemption from liability should not apply where the recipient of the service is acting under the authority or the control of the provider of a hosting service. In particular, where the provider of the online platform that allows consumers to conclude distance contracts with traders does not allow traders to determine the basic elements of the trader-consumer contract, such as the terms and conditions governing such relationship or the price, it should be considered that the trader acts under the authority or control of that platform.
Amendment 282 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders as a functionality of their service, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. This is the case where the online platform operator fails to clearly display the identity of the trader in accordance with this Regulation. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer. In particular, it is relevant whether the online platform operator withholds such identity or contact details until after the conclusion of the trader-consumer contract, or is marketing the product or service in its own name rather than using the name of the trader who will supply it.
Amendment 283 #
5 a. Providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including possibilities for collective representation.
Amendment 291 #
Proposal for a regulation
Recital 23 a (new)
(23a) Consumers should be able to safely purchase products and services online, irrespective of whether a product or service has been produced in the Union. For that reason, traders from third countries should establish a legal representative in the Union to whom claims regarding product safety could be addressed. Providers of intermediary services from inside the Union as well as from third countries should ensure compliance with product requirements set out in Union law.
Amendment 310 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduit’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, cloud infrastructure services or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting services.
Amendment 315 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraph 1 shall not apply to enterprises that previously qualified for the status of a micro, small or medium-sized enterprise (SME) within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
Amendment 319 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices at scale and exclusively by electronic means.
Amendment 331 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. Since intermediaries should not be required to remove information which is legal in their country of establishment, national and Union authorities should be able to order the blocking of content legally published outside the Union only for the territory of the Union where Union law is infringed and for the territory of the issuing Member State where national law is infringed.
Amendment 344 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market, ensure a safe and transparent online environment and provide a high level of protection for European consumers, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, to protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 346 #
Proposal for a regulation
Article 14 – paragraph 6 b (new)
6 b. Paragraphs 2, 4 and 5 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraphs 2, 4 and 5 shall not apply to enterprises that previously qualified for the status of a micro, small or medium-sized enterprise (SME) within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
Amendment 353 #
Proposal for a regulation
Recital 35
(35) In that regard, it is important that the due diligence obligations are adapted to the type, nature and size of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online.
Amendment 358 #
Proposal for a regulation
Recital 36 a (new)
(36a) Providers of intermediary services should also establish a single point of contact for recipients of services, allowing rapid, direct and efficient communication.
Amendment 359 #
Proposal for a regulation
Article 15 – paragraph 4 a (new)
Amendment 362 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. Obligations related to terms and conditions should not oblige a provider of an intermediary service to disclose information that will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets or intellectual property rights.
Amendment 364 #
Proposal for a regulation
Article 16 – title
Exclusion for micro, small and medium-sized enterprises (SMEs) and closed online platforms
Amendment 369 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 372 #
Proposal for a regulation
Article 16 – paragraph 1 a (new)
This Section shall not apply to enterprises that previously qualified for the status of a micro, small or medium-sized enterprise (SME) within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
Amendment 372 #
Proposal for a regulation
Recital 39
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or, small or medium-sized enterprises as defined in Commission Recommendation 2003/361/EC.40 __________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 376 #
Proposal for a regulation
Recital 40
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible, comprehensive and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action')following the applicable law ('action'). Such mechanisms should be clearly visible on the interface of the hosting service and easy to use. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. Providers of hosting services could, as a voluntary measure, conduct own-investigation measures to prevent content which has previously been identified as illegal from being disseminated again once removed. The obligations related to notice and action should by no means impose general monitoring obligations.
Amendment 385 #
Proposal for a regulation
Recital 41
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content. Providers of hosting services should act upon notices without undue delay, taking into account the type of illegal content that is being notified and the urgency of taking action. The provider of hosting services should inform the individual or entity notifying the specific content of its decision without undue delay after taking a decision whether to act upon the notice or not.
Amendment 400 #
Proposal for a regulation
Recital 43
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or, small and medium-sized enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small, small and medium-sized enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. __________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 401 #
Proposal for a regulation
Recital 43
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or, small or medium-sized enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. __________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 403 #
Proposal for a regulation
Recital 43 a (new)
Recital 43 a (new)
(43a) To similarly avoid unnecessary regulatory burdens, certain obligations should not apply to hosting service providers often referred to as closed online platforms where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
Amendment 405 #
Proposal for a regulation
Recital 44
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift, non-discriminatory and fair outcomes. In addition, provision should be made for the possibility of out-of-court dispute settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effectivesimple, affordable, expedient and accessible manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 418 #
Proposal for a regulation
Recital 47
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. Providers of hosting services could, as a voluntary measure, introduce own-investigation measures to prevent accounts which have previously been identified as illegal from reappearing once removed. The obligations related to notice and action should by no means impose general monitoring obligations. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open to the decisions taken in this regard by online platforms and they should be subject to oversight by the competent Digital Services Coordinator.
The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 438 #
Proposal for a regulation
Recital 49
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
Amendment 445 #
Proposal for a regulation
Recital 50
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System,45 and the Union Rapid Alert System for dangerous non-food products (Rapex) or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48.
__________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
Amendment 455 #
Proposal for a regulation
Recital 52
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However,Online advertising is a significant source of financing for many digital business models and an effective tool to reach new customers, not least for small- and medium-sized companies. However, there are some instances when online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. To ensure consumer protection, online advertisement should be subject to proportionate and meaningful transparency obligations. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement is without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising.
Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 469 #
Proposal for a regulation
Recital 54
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses could have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and meansAccordingly, the number of average monthly recipients of the service should reflect the recipients actually reached by the service either by being exposed to content or by providing content disseminated on the platforms’ interface in that period of time. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. The threshold should be designed to target the largest platforms with a reach in the Union that could lead to a systemic impact. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means; placing such due diligence obligations on smaller companies, especially micro, small and medium-sized companies, would be disproportionate.
Amendment 474 #
Proposal for a regulation
Recital 56
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as on online trade. The way they design of their services is generally optimised to benefit their often advertising- driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, withoutsometimes amplify the dissemination of illegal content. Effective regulation and enforcement is needed to effectively identifying and mitigatinge the risks and the societal and economic harm they can cauat may arise. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as by potential misuses by the recipients of the service, and take appropriate mitigating measures.
Amendment 490 #
Proposal for a regulation
Recital 61
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation, without prejudice to its freedom to conduct a business and, in particular, its ability to design and implement effective measures that are aligned with its specific business model. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delayin 30 days following its adoption, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.
Amendment 496 #
Proposal for a regulation
Recital 62
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. Often, they facilitate the search for relevant content for recipients of the service and contribute to an improved user experience. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them through making active choices. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them and why. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient.
Amendment 500 #
Proposal for a regulation
Recital 63
Recital 63
(63) Advertising systems used by very large online platforms could pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
Amendment 506 #
Proposal for a regulation
Article 25 – paragraph 1
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3 or where the operating model and nature of the platform is considered to constitute a systemic risk assessed calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3. This Section shall not apply to online platforms that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC. In addition, this Section shall not apply to enterprises that previously qualified for the status of a medium-sized, small or microenterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
Amendment 506 #
Proposal for a regulation
Recital 64
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers, where relevant to a research project. All requirementsrequests for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
Amendment 507 #
Proposal for a regulation
Recital 64 a (new)
Recital 64 a (new)
(64a) Recognition of an audit report should not interrupt or hinder the platform’s legitimate freedom to continue with its activities and business plan.
Amendment 520 #
Proposal for a regulation
Recital 68
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fakintentionally inaccurate or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerablecertain groups of recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
Amendment 613 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
Article 1 – paragraph 2 – point b
(b) set out uniformharmonised rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
Amendment 614 #
Proposal for a regulation
Article 1 – paragraph 2 – point b – point i (new)
Article 1 – paragraph 2 – point b – point i (new)
i) facilitate innovations, support digital transition, encourage economic growth and create a level playing field for digital services within the internal market while strengthening consumer protection and contributing to increased consumer choice.
Amendment 665 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 2
Article 2 – paragraph 1 – point d – indent 2
— the targetingdirecting of activities towards one or more Member States.
Amendment 697 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
Article 2 – paragraph 1 – point h
Amendment 720 #
Proposal for a regulation
Article 2 – paragraph 1 – point n
Article 2 – paragraph 1 – point n
(n) ‘advertisement’ means information designed and disseminated to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically in exchange for promoting that information;
Amendment 761 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts expeditiouslywithout undue delay to remove or to disable access to the illegal content.
Amendment 787 #
Proposal for a regulation
Article 6 – paragraph 1
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry outtake the necessary voluntary own-initiative investigations or other activities measures aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation, without prejudice to freedom of expression.
Amendment 790 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
Article 6 – paragraph 1 a (new)
Providers of intermediary services shall ensure that such measures are accompanied with appropriate safeguards, such as oversight, documentation and traceability or additional measures to ensure that own- initiative investigations are accurate, legally justified and do not lead to over- removal of content.
Amendment 857 #
Proposal for a regulation
Article 8 a (new)
Article 8 a (new)
Article 8a Injunction orders Member States shall ensure that recipients of a service are entitled under their national law to seek an injunction order as an interim measure for removing manifestly illegal content.
Amendment 897 #
Proposal for a regulation
Article 10 – title
Article 10 – title
Points of contact for authorities, the Commission and the Board
Amendment 903 #
Proposal for a regulation
Article 10 – paragraph 2
Article 10 – paragraph 2
2. Providers of intermediary services shall make publiccommunicate to their Digital Services Coordinator of establishment, the Commission and the Board the information necessary to easily identify and communicate with their single points of contact.
Amendment 908 #
Proposal for a regulation
Article 10 a (new)
Article 10 a (new)
Article 10a Point of contact for recipients of a service 1. Providers of intermediary services shall establish a single point of contact allowing for direct communication, by electronic means, with the recipients of their services. The means of communication shall be user-friendly and easily accessible. 2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact for recipients.
Amendment 918 #
Proposal for a regulation
Article 11 – paragraph 4 a (new)
Article 11 – paragraph 4 a (new)
4a. Providers of intermediary services that would qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC if established in the Union, and who have been unsuccessful in designating a legal representative after reasonable efforts, shall be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including the possibility for collective representation.
Amendment 925 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including information about algorithmic decision-making and human review. Providers of intermediary services shall also include information on the right to terminate the use of the service. The possibility to terminate must be easily accessible to the user. Information on remedies and redress mechanisms shall also be included in the terms and conditions. The terms and conditions shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
Amendment 950 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. The obligations pursuant to paragraphs 1 and 2 shall not oblige a provider of an intermediary service to disclose information that would lead to significant vulnerabilities for the security of its service or for the protection of confidential information, in particular trade secrets or intellectual property rights.
Amendment 975 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period, including information on the following, as applicable:
Amendment 989 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures;
Amendment 1002 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraph 1 shall not apply to enterprises that previously qualified for the status of a medium-sized, small or micro-enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
Amendment 1006 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 1009 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
2a. Paragraph 1 shall not apply where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
Amendment 1030 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, easy to understand, user-friendly, and allow for the submission of notices exclusively by electronic means.
Amendment 1060 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2, on the basis of which a diligent provider of hosting services is able to assess the illegality of the content in question, shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
Amendment 1064 #
Proposal for a regulation
Article 14 – paragraph 4
4. Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity.
Amendment 1067 #
Proposal for a regulation
Article 14 – paragraph 5
5. The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing clear and conclusive information on the redress possibilities in respect of that decision.
Amendment 1081 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Providers of hosting services may, as a voluntary measure in line with the provisions of Article 6, conduct own-initiative investigation measures to prevent content which has previously been identified as illegal from being disseminated again once removed. The obligations related to paragraphs 1 to 6 shall by no means impose general monitoring obligations on hosting services.
Amendment 1089 #
Proposal for a regulation
Article 14 – paragraph 6 c (new)
6c. Paragraphs 2, 4 and 5 shall not apply where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
Amendment 1096 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove, disable access to or radically restrict the visibility of specific items of information provided by the recipients of the service, or to suspend or terminate monetary payments related to those items, irrespective of the means used for detecting, identifying, removing, disabling access to or restricting the visibility or monetisation of that information and of the reason for its decision, it shall inform the recipient, without undue delay and at the latest within 24 hours after the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.
Amendment 1102 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails the removal of, the disabling of access to or the radical restriction of the visibility of the information, or the suspension or termination of monetary payments related to that information, and, where relevant, the territorial scope of the disabling of access;
Amendment 1120 #
Proposal for a regulation
Article 15 – paragraph 4
4. Providers of hosting services shall, upon request, share the decisions and the statements of reasons referred to in paragraph 1 with the Digital Services Coordinator of establishment. That information shall not contain personal data.
Amendment 1124 #
Proposal for a regulation
Article 15 – paragraph 4 b (new)
4b. Paragraphs 2 to 4 shall not apply where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
Amendment 1129 #
Proposal for a regulation
Article 15 a (new)
Amendment 1157 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove or not to remove or disable access to the information;
Amendment 1158 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) decisions to suspend or terminate or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
Amendment 1161 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) decisions to suspend or terminate or not to suspend or terminate the recipients’ account.
Amendment 1166 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions to radically restrict the visibility of content provided by the recipients;
Amendment 1171 #
Proposal for a regulation
Article 17 – paragraph 1 – point c b (new)
(cb) decisions to restrict the ability to monetise content provided by the recipients;
Amendment 1203 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1) and individuals or entities that have submitted notices shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by that body.
Amendment 1213 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a
(a) it is independent, including financially independent, and impartial of online platforms, of the recipients of the service provided by the online platforms and of individuals or entities that have submitted notices;
Amendment 1221 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
(c) the dispute settlement is easily accessible through electronic communication technology and provides for the possibility to submit a complaint and the requisite supporting documents online;
Amendment 1236 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure that are clearly visible and easily accessible to all parties concerned and in full compliance with all applicable law.
Amendment 1251 #
Proposal for a regulation
Article 18 – paragraph 5
5. Digital Services Coordinators shall notify to the Commission the out-of-court dispute settlement bodies that they have certified in accordance with paragraph 2, including where applicable the specifications referred to in the second subparagraph of that paragraph as well as out-of-court dispute settlement bodies whose status has been revoked. The Commission shall publish a list of those bodies, including those specifications, on a dedicated website, and keep it updated.
Amendment 1262 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by certified trusted flaggers, within their designated area of expertise, through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay, depending on the severity of the illegal activity.
Amendment 1296 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of trusted flagger in accordance with paragraph 2 or from which they have revoked that status in accordance with paragraph 6.
Amendment 1308 #
Proposal for a regulation
Article 19 – paragraph 6
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation carried out without undue delay, either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and to its intention to revoke the entity’s status as trusted flagger.
Amendment 1339 #
Proposal for a regulation
Article 20 – paragraph 3 – point d
(d) where identifiable, the intention of the recipient, individual, entity or complainant.
Amendment 1391 #
Proposal for a regulation
Article 22 – paragraph 1 – point f
(f) a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law and, where applicable, confirming that all products have been checked against the Union Rapid Alert System for dangerous non-food products (Rapex).
Amendment 1404 #
Proposal for a regulation
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d), (e) and (f) of paragraph 1 is reliable, through the use of any freely accessible official online database, such as the Rapex system, or online interfaces made available by a Member State or the Union, or through requests to the trader to provide supporting documents from reliable sources. The online platform shall require that traders promptly inform it of any changes to the information referred to in points (a), (d), (e) and (f) and shall regularly repeat this verification process.
Amendment 1413 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
Where the online platform obtains indications that the information under point (f) of paragraph 1 is inaccurate, it shall remove the product or service directly from its online platform. Where any other item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
Amendment 1459 #
Proposal for a regulation
Article 22 a (new)
Article 22a
Obligation to inform consumers and authorities about illegal products and services
1. Where an online platform allows consumers to conclude distance contracts with traders, it shall be subject to additional information obligations towards consumers. Where the online platform becomes aware of the illegal nature of a product or service offered by a trader on its interface, it shall:
(a) immediately remove the illegal product from its interface and inform the relevant authorities;
(b) maintain an internal database of content removed and/or recipients suspended pursuant to Article 20, to be used by internal content moderation systems tackling the identified risks;
(c) where the online platform has the contact details of the recipients of its services, inform those recipients that have purchased the product or service during the past twelve months about the illegality, the identity of the trader and the options for seeking redress;
(d) compile and make publicly available, through application programming interfaces, a repository containing information about the illegal products and services removed from its platform in the past six months, along with information about the concerned trader and the options for seeking redress.
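Point (d) calls for a machine-readable repository of removed listings with a six-month window. As a purely illustrative sketch, the amendment prescribes no schema, so every field name, the 183-day approximation of six months and the sample data below are assumptions:

```python
import json
from dataclasses import dataclass, asdict
from datetime import date, timedelta

# Hypothetical record for the point (d) repository; all field names are
# assumptions, not taken from the amendment text.
@dataclass
class RemovedListing:
    product_name: str
    trader_name: str        # "information about the concerned trader"
    trader_contact: str
    removal_date: date      # used to enforce the six-month window
    redress_options: str    # "options for seeking redress"

def repository_view(listings, today):
    """Return, as JSON, only listings removed in the past six months."""
    cutoff = today - timedelta(days=183)  # rough six-month approximation
    recent = [asdict(l) for l in listings if l.removal_date >= cutoff]
    for record in recent:
        record["removal_date"] = record["removal_date"].isoformat()
    return json.dumps(recent)

listings = [
    RemovedListing("toy drone", "ACME Ltd", "acme@example.com",
                   date(2021, 5, 1), "refund via platform"),
    RemovedListing("phone charger", "Volt GmbH", "volt@example.com",
                   date(2020, 9, 1), "contact trader"),
]
print(repository_view(listings, today=date(2021, 6, 1)))
```

Only the first listing falls within the six-month window, so the older one is silently excluded from the published view.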
Amendment 1473 #
Proposal for a regulation
Article 23 – paragraph 2
2. Online platforms shall communicate to the Digital Services Coordinator of establishment, at least once every twelve months, information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past twelve months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2).
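The figure reported under this paragraph is a twelve-month rolling average of monthly active recipients. A minimal sketch of the arithmetic, assuming one count per calendar month; the actual methodology is left to the delegated acts under Article 25(2), and the numbers are invented:

```python
from statistics import mean

# Invented monthly active-recipient counts for the past twelve months.
monthly_active_recipients = [
    41_000_000, 42_500_000, 40_800_000, 43_100_000,
    44_000_000, 42_200_000, 41_700_000, 43_900_000,
    45_200_000, 44_600_000, 43_300_000, 44_800_000,
]

assert len(monthly_active_recipients) == 12  # one figure per month

# The reported value is the plain average over the twelve-month period.
average = mean(monthly_active_recipients)
print(f"Average monthly active recipients (past twelve months): {average:,.0f}")
```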
Amendment 1476 #
Proposal for a regulation
Article 23 – paragraph 2 a (new)
2a. Member States shall refrain from imposing additional transparency reporting obligations on online platforms, other than specific requests in the context of the exercise of their supervisory powers.
Amendment 1510 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
2. Online platforms shall provide the information referred to in paragraph 1 to public authorities, upon their request, in order to determine accountability in cases of false or misleading advertising.
Amendment 1512 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
3. Providers of intermediary services shall obtain consent from the recipients of their service in order to provide them with micro-targeted and behavioural advertising. Providers of intermediary services shall ensure that recipients of services can easily make an informed choice when expressing their consent, by providing them with meaningful information.
Amendment 1552 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4) and at least once a year thereafter, any significant systemic risks stemming from the dissemination of illegal content on their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 1567 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively through dissemination of illegal content;
Amendment 1577 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative and illegal effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
Amendment 1587 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
Amendment 1604 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures targeting illegal practices, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
Amendment 1661 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
(b) any voluntary commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37.
Amendment 1665 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a) are independent from the very large online platform concerned and have not provided any other service to the platform in the previous 12 months;
Amendment 1719 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, until six months after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
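The paragraph combines two mechanical rules: entries stay published for six months after last display, and the published view must contain no personal data. A sketch under stated assumptions — the field names, the 183-day approximation and the sample entries are invented, not anything the amendment specifies:

```python
from datetime import date, timedelta

# Hypothetical ad-repository entries; field names are assumptions.
ads = [
    {"ad_id": "a1", "content": "sneaker promo",
     "last_displayed": date(2021, 1, 10),
     "targeted_user_emails": ["x@example.com"]},  # personal data: never published
    {"ad_id": "a2", "content": "bank offer",
     "last_displayed": date(2020, 3, 2),
     "targeted_user_emails": []},
]

def publishable(ads, today):
    """Keep entries for six months after last display and drop personal data."""
    cutoff = today - timedelta(days=183)  # rough six-month approximation
    return [
        {k: v for k, v in ad.items() if k != "targeted_user_emails"}
        for ad in ads
        if ad["last_displayed"] >= cutoff
    ]

print(publishable(ads, today=date(2021, 6, 1)))
```

The second entry is past the retention window and is dropped entirely, while the surviving entry is published without its personal-data field.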
Amendment 1767 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, disclose the funding of the research, have a proven record of expertise in the fields related to the risks investigated or in related research methodologies, and shall commit to, and be in a capacity to, preserve the specific data security and confidentiality requirements corresponding to each request.
Amendment 1798 #
Proposal for a regulation
Article 33 – paragraph 1
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and every twelve months thereafter.
Amendment 1808 #
Proposal for a regulation
Article 33 a (new)
Article 33a
Algorithm transparency
1. When using automated decision-making, the very large online platform shall, upon request, provide the Commission with the information necessary to assess the algorithms used.
2. When carrying out the assessments referred to in paragraph 1, the Commission shall consider the following elements:
(a) compliance with the corresponding Union requirements;
(b) potential negative effects on fundamental rights, including on consumer rights, through the dissemination of illegal content.
3. Following an assessment, the Commission shall communicate its findings to the very large online platform and allow it to provide additional explanations.
4. Where the Commission finds that the algorithm used by the very large online platform does not comply with point (a) or (b) of paragraph 2 of this Article, it shall inform the Digital Services Coordinator of establishment of the very large online platform.
Amendment 1853 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 1864 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that the codes contain. Key performance indicators and reporting commitments should take into account differences in size and capacity between different participants.
Amendment 1945 #
Proposal for a regulation
Article 41 – paragraph 2 – subparagraph 1 – point e
(e) the power to adopt proportionate interim measures to avoid the risk of serious harm, without prejudice to fundamental rights.
Amendment 2039 #
Proposal for a regulation
Article 47 – paragraph 2 – point a a (new)
(aa) contributing to the effective application of Article 3 of Directive 2000/31/EC to prevent fragmentation of the digital single market;
Amendment 2088 #
Proposal for a regulation
Article 49 – paragraph 1 – point d a (new)
(da) monitor derogations from the internal market clause in accordance with Article 3 of Directive 2000/31/EC and ensure that the conditions for derogation are interpreted strictly and narrowly to ensure consistent application of this Regulation;
Amendment 2089 #
Proposal for a regulation
Article 49 – paragraph 1 – point e
(e) support and promote the development and implementation of European standards, guidelines, reports, templates and codes of conduct, in close collaboration with relevant stakeholders, as provided for in this Regulation, as well as the identification of emerging issues with regard to matters covered by this Regulation.
Amendment 2164 #
Proposal for a regulation
Article 55 – paragraph 1
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order proportionate interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement, without prejudice to fundamental rights.
Amendment 2182 #
Proposal for a regulation
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation of, and compliance with, this Regulation by the very large online platform concerned. The Commission may also order that platform to provide explanations relating to, and where necessary access to, its databases and algorithms.
Amendment 2212 #
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
2. The Commission may, by decision and in compliance with the proportionality principle, impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total turnover in the preceding financial year, where they intentionally or as a result of repeated negligence:
Amendment 2296 #
Proposal for a regulation
Article 74 – paragraph 2
2. It shall apply from [date – twelve months after its entry into force].