27 Amendments of Ibán GARCÍA DEL BLANCO related to 2020/2019(INL)
Amendment 1 #
Draft opinion
Paragraph 1
1. Recalls that free and pluralistic media are the backbone of our democratic societies; recalls that traditional media services are strongly regulated, including by self-regulation, in order to ensure freedom of expression and editorial freedom for the content published in their media; calls for steps to be taken to ensure that the same principles apply to online and offline media services in order to establish a level playing field; considers that it is important to safeguard content for which editorial responsibility is taken or which is already subject to a generally recognised independent oversight, when such content is available on other platforms or in other services, so that it is not subjected to the same rules and supervision;
Amendment 2 #
Motion for a resolution
Citation 3 a (new)
- having regard to Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC,
Amendment 9 #
1 a. Recalls that transparency obligations on media platforms and services operating online should also apply to their ownership and their funding sources;
Amendment 14 #
Draft opinion
Paragraph 2
2. Points out that media services using automated procedures may also be subject to the ethical principles of transparency and accountability, as well as to human oversight; notes that communication always takes place in a given context, which is why automated procedures may support individual decisions on the legality of content, but may be subject to human control;
Amendment 17 #
Motion for a resolution
Recital A a (new)
Aa. whereas Directive (EU) 2018/1808 has recently updated many of the rules applicable to audiovisual media services, including video-sharing platforms, and must be implemented by Member States by 19 September 2020.
Amendment 18 #
Motion for a resolution
Recital A b (new)
Ab. whereas Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC has established new rules for online content-sharing service providers and must be implemented by Member States by 7 June 2021.
Amendment 20 #
Draft opinion
Paragraph 2 a (new)
2 a. Notes that the fight against disinformation, misinformation and mal-information spreading on media platforms, including social media platforms, requires significant corporate social responsibility, based on trust and transparency, in order to counter propaganda and hate speech undermining the Union's principles and values;
Amendment 36 #
Motion for a resolution
Recital E
E. whereas content hosting platforms have evolved from merely displaying content into sophisticated bodies and market players, in particular in the case of social networks that harvest and exploit usage data; whereas users have reasonable grounds to expect fair terms for the use of such platforms and for the use that platforms make of the users' data;
Amendment 40 #
Motion for a resolution
Recital E a (new)
Ea. whereas the transparency of digital services and content hosting platforms could contribute significantly to increasing trust in them among companies and users of these services.
Amendment 50 #
Motion for a resolution
Recital G
G. whereas upholding the law in the digital world involves not only the effective enforcement of rights, in particular fundamental rights such as freedom of expression and information, privacy, safety and security, non-discrimination and respect for property, including intellectual property rights, but also ensuring access to justice for all;
Amendment 57 #
Motion for a resolution
Recital H
H. whereas content hosting platforms often employ automated content removal mechanisms that in some cases can raise legitimate rule of law concerns, in particular when they are not encouraged by Union law to employ such mechanisms pro-actively and voluntarily, as content removal would then take place without a clear legal basis, which is in contravention of Article 10 of the European Convention on Human Rights, which states that formalities, conditions, restrictions or penalties governing the exercise of freedom of expression and information must be prescribed by law;
Amendment 67 #
Motion for a resolution
Recital J
J. whereas the current business model of certain content hosting platforms is to promote content that is likely to attract the attention of users and therefore generate more profiling data in order to offer more effective targeted advertisements and thereby increase profit; whereas this profiling, coupled with targeted advertisements, can lead to the amplification of content geared to emotions, often giving rise to the fuelling of sensationalism in news feeds and recommendation systems;
Amendment 106 #
Motion for a resolution
Paragraph 3
3. Considers that the final decision on the legality of user-generated content must be made by an independent judiciary and not by a private commercial entity;
Amendment 130 #
Motion for a resolution
Paragraph 6
6. Suggests that content hosting platforms regularly submit transparency reports to the European Agency concerning the compliance of their terms and conditions with the provisions of the Digital Services Act; further suggests that content hosting platforms publish this report, including the number of decisions on removing user-generated content, in a publicly accessible database;
Amendment 146 #
Motion for a resolution
Paragraph 8
8. Takes the firm position that the Digital Services Act shall avoid provisions forcing content hosting platforms to employ any form of fully automated ex-ante controls of content, unless otherwise specified in existing legal texts, and considers that any such mechanism voluntarily employed by platforms must be subject to audits by the European Agency to ensure compliance with the Digital Services Act;
Amendment 286 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 1 a (new)
- failure to implement any other obligations with regard to content moderation;
Amendment 313 #
Motion for a resolution
Annex I – part A – part I – section 3 – indent 1 – subi. 1
- the total number of notices received and the action taken accordingly,
Amendment 362 #
Motion for a resolution
Annex I – part B – recital 4
(4) Given the detrimental effects of the fragmentation of the Digital Single Market, the international character of content hosting, the large amount of illegal or harmful content uploaded and the dominant position of a few content hosting platforms located outside the Union, the various issues that arise in respect of content hosting need to be regulated in a manner that entails full harmonisation and therefore by means of a regulation.
Amendment 379 #
Motion for a resolution
Annex I – part B – recital 9
(9) This Regulation should avoid provisions forcing content hosting platforms to employ any form of fully automated ex-ante control of content, unless otherwise specified in existing legal texts.
Amendment 389 #
Motion for a resolution
Annex I – part B – recital 12
(12) After a notice has been issued, the uploader should be informed by the content hosting platform about it, in particular about the reason for the notice and for the action taken, and should be provided with information about the procedure, including about appeal and referral to independent dispute settlement bodies, and about available remedies in the event of false notices. Such information should, however, not be given if the content hosting platform has been informed by public authorities about ongoing law enforcement investigations. In such cases, it should be for the relevant authorities to inform the uploader about the issuing of a notice, in accordance with applicable rules.
Amendment 444 #
Motion for a resolution
Annex I – part B – Article 6 – point c
(c) the deadline for the content hosting platform to expeditiously treat a notice and take a decision;
Amendment 445 #
Motion for a resolution
Annex I – part B – Article 6 – point d
(d) the deadline for the content hosting platform to inform both parties about the outcome of the decision, which should include a justification for the action taken;
Amendment 450 #
Motion for a resolution
Annex I – part B – Article 8 – introductory part
Upon a notice being issued, and before any decision on the content has been made, the uploader of the content in question shall receive the following information from the content hosting platform:
Amendment 451 #
Motion for a resolution
Annex I – part B – Article 8 – point a
(a) the reason for the notice and for the action taken;
Amendment 459 #
Motion for a resolution
Annex I – part B – Article 9 – point 2
2. Following a notice, content hosting platforms shall decide to remove, take down or make invisible content that was the subject of a notice, if such content does not comply with legal and regulatory requirements, community guidelines or terms and conditions.
Amendment 460 #
Motion for a resolution
Annex I – part B – Article 9 – point 2 a (new)
2a. Content hosting platforms shall put in place measures to ensure that previously notified and removed content is not reuploaded online, through the establishment of a clear stay down obligation.
Amendment 472 #
Motion for a resolution
Annex I – part B – Article 12 – paragraph 1
Without prejudice to judicial or administrative orders regarding content online, content that has been taken down subject to a valid notice related to the legality of the content shall remain down until a final decision has been taken in the event of an appeal by the uploader. By contrast, content that has been subject to a notice related to its harmfulness shall remain visible until a final decision has been taken regarding its removal or takedown.