75 Amendments of Tomas TOBÉ related to 2020/0361(COD)
Amendment 81 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. The Commission and the Member States should provide guidance on how to identify illegal content.
Amendment 94 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.
Amendment 105 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to assess the grounds for and, when necessary, proceed to removing or disabling access to all copies of that content, and, in accordance with the jurisprudence of the Court of Justice of the European Union, ensure that identical or equivalent illegal content does not reappear within the same context. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 112 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage automated or non- automated activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner for the sole purpose of detecting, identifying and acting against illegal content. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union or national law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability set out in this Regulation. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 121 #
Proposal for a regulation
Recital 30
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The orders to act against illegal content may require providers of intermediary services to take steps, in the specific case, to remove identical or equivalent illegal content, within the same context. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non- disclosure of information.
Amendment 128 #
Proposal for a regulation
Recital 2 a (new)
(2 a) Complex regulatory requirements at both Union and Member State level have contributed to high administrative costs and legal uncertainty for intermediary services operating on the internal market, especially for small and medium-sized companies, adding to the risk of discriminatory practices in the Member States.
Amendment 138 #
Proposal for a regulation
Recital 12
(12) For the purpose of this Regulation the concept of “illegal content” should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or it is not in compliance with Union law since it refers to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of dangerous or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 139 #
Proposal for a regulation
Recital 42 a (new)
(42 a) Hosting service providers should not be subject to the obligation to provide a statement of reasons when doing so would infringe a legal right or cause unintended safety concerns for the recipient of the service. Specifically, in the case of one-to-one interface platforms, such as dating applications and other similar services, providing the statement of reasons should be considered likely to cause unintended safety concerns for the reporting party. As a result, dating applications and other similar services should by default refrain from providing statements of reasons. Additionally, other providers of hosting services should make reasonable efforts to assess whether providing a statement of reasons could cause unintended safety concerns for the reporting party and, in such cases, refrain from providing a statement of reasons.
Amendment 141 #
Proposal for a regulation
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro, small and medium-sized enterprises (SMEs) as defined in Recommendation 2003/361/EC of the Commission41, unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro, small and medium-sized enterprises (SMEs) from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. _________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 142 #
Proposal for a regulation
Recital 43 a (new)
(43 a) To similarly avoid unnecessary regulatory burden, certain obligations should not apply to online platforms offering products and services from third-party traders established in the European Union, where these traders' access is exclusive, curated and entirely controlled by the providers of the online platform and these traders’ products and services are reviewed and pre-approved by the providers of the online platform before they are offered on the platform. These online platforms are often referred to as closed online platforms. As the products and services offered are reviewed and pre-approved by the online platforms, the prevalence of illegal content and products on these platforms is low, and these platforms cannot benefit from the relevant liability exemptions outlined in this Regulation. These online platforms should consequently not be subject to the obligations which are necessary for platforms with different operational models, where the prevalence of illegal content is more frequent and the relevant liability exemptions are available.
Amendment 151 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms, having received guidance from public authorities on how to identify illegal content, take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
Amendment 157 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders on the platforms should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
Amendment 161 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot, as this would be disproportionate. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties or be liable for this information in case it proves to be inaccurate. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48.
_________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
Amendment 178 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should target illegal practices and aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, to protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 185 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with its obligations under this Regulation, without prejudice to its freedom to conduct a business and, in particular, its ability to design and implement effective measures that are aligned with its specific business model. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. A disclaimer of an opinion should be given where the auditor does not have enough information to conclude on an opinion due to the novelty of the issues audited.
Amendment 197 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, depending on the severity of the illegal activity, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
Amendment 211 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no proportionate alternative and less restrictive measures that would effectively achieve the same result.
Amendment 225 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3 a (new)
- Providers of not-for-profit scientific or educational repositories are not considered intermediary services within the meaning of this Regulation.
Amendment 266 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they take the necessary voluntary own-initiative investigation measures for the sole purpose of detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation.
Amendment 276 #
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
(b a) facilitate innovation, support the digital transition, encourage economic growth and create a level playing field for digital services within the internal market.
Amendment 283 #
5 a. Providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including possibilities for collective representation.
Amendment 285 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
Amendment 315 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraph 1 shall not apply to enterprises that previously qualified for the status of a micro, small or medium-sized enterprise (SME) within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
Amendment 319 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices at scale and exclusively by electronic means.
Amendment 320 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they take the necessary voluntary own-initiative investigations or other measures aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation.
Amendment 329 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear identification of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
Amendment 344 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. Where a provider of hosting services processes a notice and decides to remove or disable access to specific items of information provided by the recipients of the service, it shall take steps, in the specific case, to remove identical or equivalent illegal content, within the same context.
Amendment 346 #
Proposal for a regulation
Article 14 – paragraph 6 b (new)
6 b. Paragraphs 2, 4 and 5 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraphs 2, 4 and 5 shall not apply to enterprises that previously qualified for the status of a micro, small or medium-sized enterprise (SME) within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
Amendment 348 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, without undue delay and at the latest within 24 hours after such removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.
Amendment 352 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
(c) where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means;
Amendment 358 #
Proposal for a regulation
Article 15 – paragraph 4
Amendment 359 #
Proposal for a regulation
Article 15 – paragraph 4 a (new)
Amendment 360 #
Proposal for a regulation
Article 15 – paragraph 4 b (new)
4 b. Providers of hosting services shall not be obliged to provide a statement of reasons referred to in paragraph 1 where doing so would infringe a legal obligation or where the statement of reasons could cause unintended safety concerns for the reporting party. In addition, providers of hosting services shall not be obliged to provide a statement of reasons referred to in paragraph 1 where the provider can demonstrate that the recipient of the service has repeatedly provided illegal content.
Amendment 362 #
Proposal for a regulation
Article 15 a (new)
Article 15 a
Protection against repeated misuse and criminal offences
1. Providers of intermediary services shall, after having issued a prior warning, suspend or in appropriate circumstances terminate the provision of their services to recipients of the service that frequently provide illegal content.
2. Where a provider of intermediary services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available. Where the provider of intermediary services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it has its main establishment or has its legal representative and also transmit this information to Europol for appropriate follow-up.
Amendment 364 #
Proposal for a regulation
Article 16 – title
Exclusion for micro, small and medium-sized enterprises (SMEs) and closed online platforms
Amendment 369 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 372 #
Proposal for a regulation
Article 16 – paragraph 1 a (new)
This Section shall not apply to enterprises that previously qualified for the status of micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
Amendment 373 #
Proposal for a regulation
Article 16 – paragraph 1 b (new)
This Section shall not apply to online platforms offering products and services from third-party traders, which are established in the European Union, where these traders' access is exclusive, curated and entirely controlled by the providers of the online platform and these traders’ products and services are reviewed and pre-approved by the providers of the online platform before they are offered on the platform.
Amendment 388 #
Proposal for a regulation
Article 17 – paragraph 5
Amendment 391 #
1. After internal complaint-handling mechanisms are exhausted, recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 393 #
Proposal for a regulation
Article 11 – paragraph 4 a (new)
4 a. Providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable efforts, shall be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative facilitate further cooperation and recommend possible solutions, including the possibility of collective representation.
Amendment 417 #
Proposal for a regulation
Article 19 – paragraph 7 a (new)
7 a. Online platforms shall, where possible, provide trusted flaggers with access to technical means that help them detect illegal content on a large scale.
Amendment 422 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, or in appropriate circumstances terminate, the provision of their services to recipients of the service that frequently provide manifestly illegal content.
Amendment 425 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraph 1 shall not apply to enterprises that previously qualified for the status of a medium-sized, small or micro-enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
Amendment 427 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints- handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
Amendment 430 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year;
Amendment 434 #
Proposal for a regulation
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension, and the circumstances in which they will terminate their services.
Amendment 437 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
2. Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it has its main establishment or has its legal representative and also transmit this information to Europol for appropriate follow-up.
Amendment 442 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with professional traders on the platform, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained from the trader the following information:
Amendment 446 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned, if the illegality of the specific item of information is sufficiently precise and adequately substantiated based on the assessment of the provider.
Amendment 448 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
Amendment 450 #
Proposal for a regulation
Article 22 – paragraph 1 – point d
(d) to the extent the contract relates to products that are subject to the Union Regulations listed in Article 4(5) of Regulation (EU) 2019/1020 of the European Parliament and the Council51, the name, address, telephone number and electronic mail address of the economic operator, established in the Union, referred to in Article 4(1) of that Regulation, or any relevant act of Union law;
_________________
51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
Amendment 455 #
Proposal for a regulation
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, take effective steps that would reasonably be taken by a diligent operator in accordance with a high industry standard of professional diligence to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is accurate, current and reliable, through the use of independent and reliable sources, including any freely accessible official online database or online interface made available by a Member State or the Union, or through requests to the trader to provide supporting documents from reliable sources. The provider of intermediary services should require that traders promptly inform them of any changes to the information referred to in points (a), (d) and (e) and regularly repeat this verification process.
Amendment 457 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall, where the information provided is sufficiently clear, process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Amendment 461 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. Paragraphs 2, 4 and 5 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC, or to those enterprises within twelve months of losing such status pursuant to Article 4(2) thereof.
Amendment 462 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
3. Where the online platform obtains indications, through the effective steps taken under paragraph 2 or through Member States’ consumer authorities, that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate, out of date or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
Amendment 478 #
Proposal for a regulation
Article 23 – paragraph 1 – point b
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
Amendment 479 #
Proposal for a regulation
Article 15 – paragraph 4 a (new)
4 a. Paragraphs 2 to 4 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC, or during the first twelve months from when an enterprise lost such status pursuant to Article 4(2) thereof.
Amendment 485 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro, small or medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC, nor to such enterprises during the first twelve months following the loss of that status pursuant to Article 4(2) thereof.
Amendment 537 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by certified trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay, depending on the severity of the illegal activity.
Amendment 572 #
Proposal for a regulation
Article 20 – paragraph 3 – point d
(d) where identifiable, the intention of the recipient, individual, entity or complainant.
Amendment 582 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a) are independent from the very large online platform concerned; and have not provided any other service to the platform in the previous 12 months.
Amendment 584 #
Proposal for a regulation
Article 28 – paragraph 2 – point c a (new)
(c a) have not audited the same very large online platform for more than 3 consecutive years.
Amendment 587 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with professional traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information:
Amendment 587 #
Proposal for a regulation
Article 28 – paragraph 3 – point f
(f) where the audit opinion is negative, recommendations on specific measures to achieve compliance and risk-based remediation timelines, with a focus on rectifying, as a priority, issues that have the potential to cause most harm to users of the service.
Amendment 588 #
Proposal for a regulation
Article 28 – paragraph 3 – point f a (new)
(f a) where the organisations that perform the audits do not have enough information to conclude an opinion due to the novelty of the issues audited, a relevant disclaimer.
Amendment 589 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
Amendment 591 #
Proposal for a regulation
Article 28 – paragraph 4 a (new)
4 a. Digital Services Coordinators shall provide very large online platforms under their jurisdiction with an annual audit plan outlining the key areas of focus for the upcoming audit cycle.
Amendment 616 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, for advertisements that have been seen by more than 5 000 recipients of the service and until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
Amendment 620 #
Proposal for a regulation
Article 30 – paragraph 2 – point e
(e) the total number of recipients of the service reached and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically.
Amendment 638 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures targeting illegal practices, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
Amendment 776 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board should encourage and facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
Amendment 778 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 780 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Board should regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain.
Amendment 910 #
Proposal for a regulation
Article 74 – paragraph 2 – introductory part
2. It shall apply from [date - twelve months after its entry into force].