26 Amendments of Moritz KÖRNER related to 2020/2022(INI)
Amendment 50 #
Motion for a resolution
Recital G
G. whereas a pure self-regulatory approach of platforms may not provide adequate transparency to public authorities, civil society and users on how platforms address illegal and harmful content; whereas such an approach may not guarantee compliance with fundamental rights;
Amendment 74 #
Motion for a resolution
Paragraph -1 (new)
-1. Stresses that the reform of the current liability regime for digital service providers must be proportionate, must not disadvantage small and medium-sized companies, and must not limit innovation, access to information, or freedom of expression.
Amendment 83 #
Motion for a resolution
Paragraph 1
1. Stresses that illegal content online should be tackled with the same rigour and based on the same legal principles as illegal content offline;
Amendment 86 #
Motion for a resolution
Paragraph 1 a (new)
1a. Is convinced that it is solely the task of democratically accountable competent public authorities to decide on the legality of content online.
Amendment 88 #
Motion for a resolution
Paragraph 1 b (new)
1b. Stresses that digital service providers must only be mandated to take their users’ content offline based on sufficiently substantiated orders by democratically accountable competent public authorities.
Amendment 98 #
Motion for a resolution
Paragraph 2 a (new)
2a. Is convinced that digital service providers must not retain data for law enforcement purposes unless a targeted retention of an individual user’s data is directly ordered by a democratically accountable competent public authority in line with Union law.
Amendment 100 #
Motion for a resolution
Paragraph 2 b (new)
2b. Requests that digital services should to the maximum extent possible be accessible without the need for users to reveal their identity.
Amendment 102 #
Motion for a resolution
Paragraph 2 c (new)
2c. Reiterates that digital service providers must respect and enable their users’ right to data portability as laid down in Union law.
Amendment 110 #
Motion for a resolution
Paragraph 3 a (new)
3a. Calls on digital service providers to take content offline in a diligent, proportionate and non-discriminatory manner, and with due regard in all circumstances to the fundamental rights of the users, and to take into account the fundamental importance of the freedom of expression and information in an open and democratic society, with a view to avoiding the removal of content which is not illegal. Requests digital service providers which, on their own initiative, want to restrict certain legal content of their users to explore the possibility of labelling rather than taking offline that content, giving users the chance to self-responsibly choose to access that content.
Amendment 131 #
Motion for a resolution
Paragraph 6 a (new)
6a. Is convinced that digital service providers must not be mandated to apply one Member State’s national restrictions on freedom of speech in another Member State where that restriction does not exist.
Amendment 150 #
Motion for a resolution
Paragraph 9
9. Calls, to this end, for legislative proposals that keep the digital single market open and competitive by requiring digital service providers to apply effective, coherent, transparent and fair procedures and procedural safeguards to remove illegal content in line with European values and law; firmly believes that this should be harmonised within the digital single market;
Amendment 151 #
Motion for a resolution
Paragraph 10
10. Believes, in this regard, that online platforms that are actively hosting or moderating content should bear more, yet proportionate, responsibility for the infrastructure they provide and the content on it; emphasises that this should be achieved without resorting to general monitoring requirements;
Amendment 156 #
Motion for a resolution
Paragraph 10 a (new)
10a. Stresses that public authorities must not impose any obligation on digital service providers, whether de jure or de facto, to monitor the information which they transmit or store, nor a general obligation to seek, moderate or filter content indicating illegal activity.
Amendment 157 #
Motion for a resolution
Paragraph 10 b (new)
10b. Is convinced that digital service providers must not be required to prevent the upload of illegal content. Believes at the same time that, where technologically feasible, based on sufficiently substantiated orders by democratically accountable competent public authorities, and taking full account of the specific context of the content, digital service providers may be required to execute periodic searches for distinct pieces of content which, in line with the ECJ judgment in Case C-18/18, are identical or equivalent to content that a court has already declared unlawful, and to take that content offline.
Amendment 163 #
Motion for a resolution
Paragraph 11
11. Expects digital service providers to establish fair and transparent notice mechanisms, which empower users to notify the relevant democratically accountable competent public authorities of potentially illegal content. Highlights that this should avoid unnecessary regulatory burdens for the platforms and unnecessary and disproportionate restrictions on fundamental rights, including the freedom of expression;
Amendment 170 #
Motion for a resolution
Paragraph 12
12. Stresses the need for appropriate safeguards and due process obligations, including human oversight and verification, in addition to counter notice procedures, to ensure that decisions to take content offline are accurate, well-founded and respect fundamental rights; recalls that the possibility of judicial redress should be made available to satisfy the right to effective remedy; believes that all decisions to take users’ content offline must be subject to human oversight and verification;
Amendment 179 #
Motion for a resolution
Paragraph 13
13. Supports limited liability for content and the country of origin principle, but considers improved coordination for removal requests between national competent authorities to be essential; emphasises that such orders should be subject to legal safeguards in order to prevent abuse and ensure full respect of fundamental rights; stresses that sanctions should apply to those service providers that fail to comply with lawful orders;
Amendment 182 #
Motion for a resolution
Paragraph 13 a (new)
13a. Requires digital service providers that become aware of alleged illegal content of their users to notify the competent public authorities without undue delay.
Amendment 184 #
Motion for a resolution
Paragraph 13 b (new)
13b. Requests Member States and digital service providers to put in place transparent, effective, fair, and expeditious complaint and redress mechanisms to allow users to challenge the taking offline of their content.
Amendment 185 #
Motion for a resolution
Paragraph 13 c (new)
Amendment 186 #
Motion for a resolution
Paragraph 13 d (new)
13d. Believes that neither infrastructure service providers, payment providers, or other companies offering services to digital service providers, nor digital service providers with a direct relationship with the user, must be held liable for content a user uploads or downloads on his or her own initiative. Believes at the same time that digital service providers which have a direct relationship with the user who uploaded the illegal content, and which have the ability to take distinct pieces of the user’s content offline, must be held liable for failing to expeditiously respond to sufficiently substantiated orders by democratically accountable competent public authorities to take the illegal content offline.
Amendment 200 #
Motion for a resolution
Paragraph 15
15. Underlines that certain types of legal, yet potentially harmful, content should also be addressed to ensure a fair digital ecosystem; expects guidelines to include increased transparency rules on content moderation or political advertising policy to ensure that removals and the blocking of potentially harmful content are avoided;
Amendment 207 #
Motion for a resolution
Paragraph 15 a (new)
15a. Calls on digital service providers to take the necessary measures to identify and label content uploaded by social bots.
Amendment 231 #
Motion for a resolution
Paragraph 20
20. Supports the creation of an independent EU body to exercise effective oversight of compliance with the applicable rules; believes that it should enforce procedural safeguards and transparency and provide quick and reliable guidance on contexts in which legal content is to be considered harmful; calls on the Commission to set up, similarly to the European System of Financial Supervision (ESFS), a European system of digital services’ supervision and to task existing EU agencies and competent national supervisory authorities with auditing digital service providers’ internal policies and algorithms, with due regard to Union law and in all circumstances to the fundamental rights of the services’ users, taking into account the fundamental importance of non-discrimination and the freedom of expression and information in an open and democratic society, and without publishing commercially sensitive data; requests that this European system of digital services’ supervision ensure that the rules applicable to digital service providers are adequately implemented and enforced across Member States in order to provide protection for the services’ users and to facilitate a European digital single market.
Amendment 260 #
Motion for a resolution
Paragraph 23 a (new)
23a. Requests that digital service providers, to the maximum extent possible, give their users the possibility to choose which content they want to be presented with and in which order.
Amendment 264 #
Motion for a resolution
Paragraph 23 b (new)
23b. Requests, based on the principles above, that the Digital Services Act harmonises and replaces the liability measures laid down in the Digital Single Market Copyright Directive, the Audiovisual Media Services Directive and the Terrorist Content Online Regulation.