20 Amendments of Adrián VÁZQUEZ LÁZARA related to 2020/0361(COD)
Amendment 129 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, content-sharing platforms, search engines, livestreaming platforms, messaging services or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
Amendment 172 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws, in conformity with Union law, including the Charter of Fundamental Rights of the European Union, on the basis of which such orders are issued differ considerably, and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders.
Amendment 197 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in so far as they qualify as providers of hosting services covered by this Regulation. Furthermore, the notice and action mechanism should be complemented by ‘stay down’ provisions, whereby providers of hosting services should demonstrate their best efforts to prevent content which is identical to content that they have already identified and removed as illegal from reappearing. The application of this requirement should not lead to any general monitoring obligation.
Amendment 211 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means that have proven to be efficient, proportionate and reliable, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. The means of recourse available to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 261 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content and intentional manipulation and exploitation of the service, including amplification of harmful content, adapting their decision-making processes, or adapting their terms and conditions, as well as making content moderation policies and the way they are enforced fully transparent for the users. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 278 #
Proposal for a regulation
Recital 62 a (new)
(62 a) The practice of very large online platforms of associating advertisements with content uploaded by users could indirectly lead to the monetisation and promotion of illegal content, or content that is in breach of their terms and conditions, and could risk considerably damaging the brand image of the buyers of advertising space. In order to prevent such practices, the very large online platforms should ensure, including through standard contractual guarantees to the buyers of advertising space, that the content to which they associate advertisements is legal and compliant with their terms and conditions. Furthermore, the very large online platforms should allow advertisers to have direct access to the results of independent audits evaluating the commitments and tools of platforms for protecting the brand image of the buyers of advertising space ('brand safety').
Amendment 326 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, intellectual property, detection and investigation of fraud against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
Amendment 565 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices, differentiated by whether the action was taken on the basis of the law or of the terms and conditions of the provider, and the average time needed for taking the action;
Amendment 585 #
Proposal for a regulation
Chapter III – Section 2 – title
2 Additional provisions applicable to providers of hosting services, including online platforms, and to providers of live streaming platform services and of private messaging services
Amendment 615 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services, of live streaming platform services and of private messaging services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, or in respect of the recipient of the service who provided this information, in a timely, diligent, non-discriminatory and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Amendment 619 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
Amendment 734 #
Proposal for a regulation
Article 19 – paragraph 4 a (new)
4a. Member States may recognise entities that were awarded the status of trusted flagger in another Member State as trusted flaggers on their own territory. Upon request by a Member State, trusted flaggers can be awarded the status of European trusted flagger by the Board, in accordance with Article 48(2). The Commission shall keep a register of European trusted flaggers.
Amendment 836 #
Proposal for a regulation
Article 24 – paragraph 1 c (new)
Online platforms that display advertising on their online interfaces shall ensure that advertisers:
(a) can request and obtain information on where their advertisements have been placed;
(b) can request and obtain information on which broker processed their data;
(c) can indicate the specific locations in which their advertisements cannot be placed.
In case of non-compliance with this provision, advertisers shall have the right to judicial redress.
Amendment 843 #
Proposal for a regulation
Article 25 – title
Very large online platforms, live streaming platforms, private messaging providers and search engines
Amendment 1074 #
Proposal for a regulation
Article 43 a (new)
Article 43a
Rights to effective judicial remedies
1. Without prejudice to any available administrative or non-judicial remedy, any recipient of the service or representative organisation shall have the right to an effective judicial remedy where he or she has suffered harm as a result of an infringement of Articles 26(1) and 27(1).
2. In determining whether the very large online platform has complied with its obligations under Article 27(1), and in light of the principle of proportionality, the availability of suitable and effective measures shall be taken into account.
3. Such proceedings may be brought before the courts of the Member State where the recipient of the service has his or her habitual residence.
4. Without prejudice to any other administrative or non-judicial remedy, any recipient of the service or representative organisation shall have the right to an effective judicial remedy where the Digital Services Coordinator which is competent pursuant to Articles 40 and 43 does not handle a complaint or does not inform the recipient of the service within three months of the progress or outcome of the complaint lodged pursuant to Article 43. Proceedings against a Digital Services Coordinator under paragraph 4 shall be brought before the courts of the Member State where the Digital Services Coordinator is established.
Amendment 1664 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by organisations which have been selected by the Commission and:
Amendment 1711 #
Proposal for a regulation
Article 30 – title
Additional online advertising transparency and protection
Amendment 1738 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. Very large online platforms shall be prohibited from profiling children under the age of 16 for commercial practices, including personalised advertising, in compliance with the industry standards laid down in Article 34 and Regulation (EU) 2016/679.
Amendment 1835 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
1a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child, observance of which, once adopted, will be mandatory for very large online platforms, at least for the following:
(a) age assurance and age verification;
(b) child impact assessments;
(c) child-centred and age-appropriate design;
(d) child-centred and age-appropriate terms and conditions.
Amendment 1893 #
Proposal for a regulation
Article 36 a (new)
Article 36a
Codes of conduct for the protection of minors
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers and organisations representing minors, parents and civil society organisations or relevant authorities, to further contribute to the protection of minors online.
2. The Commission shall aim to ensure that the codes of conduct pursue an effective protection of minors online, which respects their rights as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment. The Commission shall aim to ensure that the codes of conduct address at least:
(a) age verification and age assurance models, taking into account the industry standards referred to in Article 34;
(b) child-centred and age-appropriate design, taking into account the industry standards referred to in Article 34.
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.