Activities of József SZÁJER related to 2020/2019(INL)
Shadow reports (1)
REPORT with recommendations to the Commission on a Digital Services Act: adapting commercial and civil law rules for commercial entities operating online
Amendments (104)
Amendment 4 #
Motion for a resolution
Citation 7 a (new)
- having regard to the Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 25 May 2016 on Online Platforms and the Digital Single Market - Opportunities and Challenges for Europe (COM(2016)288),
Amendment 5 #
Motion for a resolution
Citation 7 b (new)
- having regard to the Recommendation of the Commission of 1 March 2018 on measures to effectively tackle illegal content online (C(2018) 1177),
Amendment 6 #
Motion for a resolution
Citation 7 c (new)
- having regard to the Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market,
Amendment 7 #
Motion for a resolution
Citation 7 d (new)
- having regard to the Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services,
Amendment 8 #
Motion for a resolution
Citation 7 e (new)
- having regard to the Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography,
Amendment 9 #
Motion for a resolution
Citation 7 f (new)
- having regard to the Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism,
Amendment 12 #
Motion for a resolution
Recital A
A. whereas digital services, being a cornerstone of the Union’s economy and the livelihood of a large number of its citizens, need to be regulated in a way that balances central concerns such as respect for fundamental rights and other rights of citizens with the need to support development and economic progress, taking into account the interests of users and all market participants, with particular regard to small businesses, SMEs and start-ups;
Amendment 22 #
Motion for a resolution
Recital B a (new)
Ba. whereas digital services are used by the majority of Europeans on a daily basis, but are subject to an increasingly wide set of rules across the EU, leading to significant fragmentation of the market and consequently legal uncertainty for European users and services operating cross-border, combined with a lack of regulatory control over key aspects of today's information environment;
Amendment 48 #
Motion for a resolution
Recital F
F. whereas content hosting platforms may determine what content is shown to their users, thereby profoundly influencing the way we obtain and communicate information, to the point that content hosting platforms have de facto become public spaces in the digital sphere; whereas public spaces must be managed in a manner that respects fundamental rights and the rights of the users;
Amendment 49 #
Motion for a resolution
Recital G
G. whereas upholding the law in the digital world does not only involve effective enforcement of rights, but also, in particular, ensuring access to justice for all; whereas delegation of the taking of decisions regarding the legality of content or of law enforcement powers to private companies can undermine the right to a fair trial and risks failing to provide an effective remedy; whereas the taking of decisions by digital service providers should be complemented by a fast-track legal procedure with adequate guarantees;
Amendment 54 #
Motion for a resolution
Recital H
Amendment 60 #
Motion for a resolution
Recital H a (new)
Ha. whereas automated content removal mechanisms of digital service providers should be proportionate, covering only justified cases where the benefits of removing content outweigh the potential disadvantages of keeping it online; whereas these procedures should also be transparent and their terms and conditions should be made known to users before they use the service;
Amendment 81 #
Motion for a resolution
Paragraph 1
1. Requests that the Commission submit without undue delay a set of legislative proposals comprising a Digital Services Act with a wide material, personal and territorial scope, including the recommendations as set out in the Annex to this resolution; considers that, without prejudice to detailed aspects of the future legislative proposals, Article 114 of the Treaty on the Functioning of the European Union should be chosen as the legal basis;
Amendment 86 #
Motion for a resolution
Paragraph 2
2. Proposes that the Digital Services Act provide digital service providers with a clear and up-to-date innovation friendly regulatory framework, protect users when accessing digital services, guarantee accessible and independent recourse to judicial redress and ensure the necessary cooperation among Member States;
Amendment 93 #
Motion for a resolution
Paragraph 2 a (new)
2a. Proposes that the Digital Services Act follow a sector- and problem-specific approach and make a clear distinction between illegal and harmful content when elaborating the appropriate policy options;
Amendment 96 #
Motion for a resolution
Paragraph 2 b (new)
2b. Underlines that any new framework established in the Digital Services Act should be manageable for small businesses, SMEs and start-ups and should therefore include proportionate obligations and clear safeguards for all sectors;
Amendment 97 #
Motion for a resolution
Paragraph 2 c (new)
2c. Proposes that the Digital Services Act introduce enhanced transparency rules for social media platforms in order to disclose the funding and the power of interest groups behind those using the digital services and to show who is legally responsible for the content;
Amendment 98 #
Motion for a resolution
Paragraph 2 d (new)
2d. Proposes that the Digital Services Act set the obligation for digital service providers without a permanent establishment in the EU to designate a legal representative for the interests of users within the European Union and to make the contact information of this representative visible and accessible on its website;
Amendment 99 #
Motion for a resolution
Paragraph 2 e (new)
2e. Underlines that online platforms hosting or moderating content online should bear more responsibility for the content they host and should act proactively to prevent illegality;
Amendment 105 #
Motion for a resolution
Paragraph 3
3. Considers that, following the actions of digital service providers, any final decision on the legality of user-generated content must be made by an independent judiciary and not a private commercial entity;
Amendment 110 #
Motion for a resolution
Paragraph 4
4. Insists that the regulation must proscribe content moderation practices that are disproportionate or unduly go beyond the purpose of protection under the law;
Amendment 118 #
Motion for a resolution
Paragraph 5
5. Recommends the establishment of a network of national authorities tasked with monitoring the practice of automated content filtering and curation, and reporting to the EU institutions;
Amendment 129 #
Motion for a resolution
Paragraph 6
6. Suggests that digital service providers regularly submit transparency reports to the network of national authorities and the European Commission concerning the compliance of their terms and conditions with the provisions of the Digital Services Act; further suggests that content hosting platforms publish statistics and data related to the automated content filtering and their decisions on removing user-generated content on a publicly accessible database;
Amendment 138 #
Motion for a resolution
Paragraph 7
7. Considers the establishment of independent dispute settlement bodies in the Member States, tasked with settling disputes regarding content moderation;
Amendment 149 #
Motion for a resolution
Paragraph 8
8. Takes the firm position that the Digital Services Act must not contain provisions forcing digital service providers to employ an automated filtering mechanism that goes beyond the level of protection required by the law, but encourages digital service providers to employ such a mechanism in order to combat illegal content online;
Amendment 153 #
Motion for a resolution
Paragraph 9
9. Considers that the user-targeted amplification of content based on the views or positions presented in such content is a practice on which further monitoring might be required; the Commission should therefore pay attention to and analyse the impact of cases where the visibility of such content is increased on the basis of previous user interaction with other amplified content and with the purpose of optimising user profiles for targeted advertisements;
Amendment 157 #
Motion for a resolution
Paragraph 10
Amendment 161 #
Motion for a resolution
Paragraph 10 a (new)
10a. Notes, however, that targeted advertising is currently governed by the General Data Protection Regulation, which has to be properly enforced in the Union before any new legislation in this field is considered;
Amendment 163 #
Motion for a resolution
Paragraph 11
11. Recommends, therefore, that the Digital Services Act introduce rules in order to enhance transparency related to targeted advertising, especially when data are tracked on third party websites;
Amendment 170 #
Motion for a resolution
Paragraph 12
Amendment 192 #
Motion for a resolution
Paragraph 15 a (new)
15a. Suggests creating a common understanding of what constitutes false or misleading advertising;
Amendment 196 #
Motion for a resolution
Paragraph 16
16. Calls for a streamlined exchange of necessary information between digital service providers and public authorities;
Amendment 207 #
Motion for a resolution
Paragraph 18
18. Considers that necessary steps should be taken in order to ensure equality between the parties in case of smart contracts, for which the Commission should examine the modalities;
Amendment 212 #
Motion for a resolution
Subheading 5
Amendment 213 #
Motion for a resolution
Paragraph 19
Amendment 216 #
Motion for a resolution
Paragraph 20
Amendment 219 #
Motion for a resolution
Paragraph 21
Amendment 234 #
Motion for a resolution
Annex I – part A – introductory part – indent 6
Amendment 235 #
Motion for a resolution
Annex I – part A – introductory part – indent 7
- The proposal raises the civil and commercial law aspects needed for assessment in the field of distributed ledger technologies, including block chains and, in particular, smart contracts.
Amendment 237 #
Motion for a resolution
Annex I – part A – introductory part – indent 8
- The proposal raises the importance of bringing clarity on the non-negotiable terms and conditions used by online platforms, ensuring the right to access data and guaranteeing that access to justice is appropriately guaranteed.
Amendment 238 #
Motion for a resolution
Annex I – part A – part I – introductory part
The Digital Services Act should reflect, among others, the following elements of the proposals, on the basis of a proper public consultation and impact analysis:
Amendment 239 #
Motion for a resolution
Annex I – part A – part I – section 1 –introductory part
A regulation ‘on contractual rights as regards content management’ that contains the following elements:
Amendment 241 #
Motion for a resolution
Annex I – part A – part I – section 1 –– indent 1 a (new)
- It should build upon the home state control principle, by updating its scope in light of the increasing convergence of user protection.
Amendment 242 #
Motion for a resolution
Annex I – part A – part I – section 1 –– indent 1 b (new)
- It should make a clear distinction between illegal and harmful content when it comes to applying the appropriate policy options.
Amendment 243 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 1 c (new)
- It should avoid extending its scope in a way that would conflict with existing sectoral rules already in force, such as the Copyright Directive or other existing European law in the media and audiovisual field.
Amendment 244 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 2
- It should provide principles for content moderation, including as regards discriminatory content moderation practices.
Amendment 250 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 3
- It should provide formal and procedural standards for a notice and action system by following a sector-specific approach.
Amendment 254 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 4
- It should provide rules for an independent dispute settlement mechanism by respecting the national competences of the Member States.
Amendment 258 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 5 a (new)
- It should provide rules regarding the responsibility of content hosting platforms for goods sold or advertised on them, taking into account supporting activities for SMEs in order to minimize their burden when adapting to this responsibility.
Amendment 264 #
Motion for a resolution
Annex I – part A – part I – section 2 – introductory part
A network of national authorities should be established with the following main tasks:
Amendment 267 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 1
- regular monitoring of the algorithms employed by content hosting platforms for the purpose of content moderation as well as curation;
Amendment 269 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 1 a (new)
- regular monitoring of the practice of automated content filtering and curation, and reporting to the EU institutions;
Amendment 272 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 2
- regular review of the compliance of content hosting platforms with the Regulation and other provisions that form part of the Digital Services Act, in particular as regards the correct implementation of the standards for notice-and-action procedures and content moderation in their terms and conditions, on the basis of transparency reports provided by the content hosting platforms and the public database of decisions on removal of content to be established by the Digital Services Act;
Amendment 275 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 3 a (new)
- cooperating and coordinating with the national authorities of the Member States as regards the implementation of the Digital Services Act.
Amendment 279 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – introductory part
- reporting to the Commission detected non-compliance with the rules established by the Digital Services Act, including publishing biannual reports on all of its activities.
Amendment 282 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 1
Amendment 287 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 2
Amendment 292 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 3
Amendment 299 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 4
Amendment 306 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 5
Amendment 309 #
Motion for a resolution
Annex I – part A – part I – section 3 –– introductory part
The Digital Services Act should contain provisions requiring content hosting platforms to regularly provide transparency reports to the Commission and the network of national authorities. Such reports should, in particular, include:
Amendment 321 #
Motion for a resolution
Annex I – part A – part II – section 1 – introductory part
Measures regarding content curation, data and online advertisements in breach of fair contractual rights of users should include:
Amendment 335 #
Motion for a resolution
Annex I – part A – part II – section 2
Amendment 337 #
Motion for a resolution
Annex I – part A – part II – section 2 – indent 1
Amendment 347 #
Motion for a resolution
Annex I – part A – part II – section 3 – indent 1
- measures ensuring that the proper legislative framework is in place for the development and deployment of digital services, including distributed ledger technologies such as block chains, and in particular for smart contracts,
Amendment 348 #
Motion for a resolution
Annex I – part A – part II – section 3 – indent 2
Amendment 350 #
Motion for a resolution
Annex I – part A – part II – section 3 – indent 2 a (new)
- measures to ensure equality between the parties in case of smart contracts, taking into account in particular the interests of small businesses and SMEs, for which the Commission should examine possible modalities.
Amendment 351 #
Motion for a resolution
Annex I – part A – part II – section 4
Amendment 352 #
Motion for a resolution
Annex I – part A – part II – section 4 – indent 1
Amendment 356 #
Motion for a resolution
Annex I – part A – part II – section 4 – indent 2
Amendment 358 #
Motion for a resolution
Annex I – part A – part II – section 4 – indent 3
Amendment 359 #
Motion for a resolution
Annex I – part A – part II – section 4– final part
Amendment 360 #
Motion for a resolution
Annex I – part B – recital 1
(1) The terms and conditions that digital service providers apply in relations with users are often non-negotiable and can be unilaterally amended by those providers. Action at a legislative level is needed to put in place minimum standards for such terms and conditions, in particular as regards procedural standards for content management;
Amendment 368 #
Motion for a resolution
Annex I – part B – recital 5
(5) Concerning relations with users, this Regulation should lay down minimum standards for the transparency and accountability of terms and conditions of content hosting platforms. Terms and conditions should include transparent, binding and uniform standards and procedures for content moderation, which should guarantee accessible and independent recourse to judicial redress.
Amendment 369 #
Motion for a resolution
Annex I – part B – recital 6
Amendment 376 #
Motion for a resolution
Annex I – part B – recital 8 a (new)
(8a) Far too many goods sold online do not follow safety standards. One way of ensuring that content hosting platforms perform due diligence checks of goods sold by or through them is to make the platforms jointly and severally responsible together with the primary seller. This would not be unreasonable for the content hosting platforms given that they take a share of the proceeds. Special attention should be paid to enabling small and medium-sized platforms to perform these checks, and any supporting activity such as standardisation should ensure that administrative burdens are kept to a minimum.
Amendment 377 #
Motion for a resolution
Annex I – part B – recital 9
Amendment 383 #
Motion for a resolution
Annex I – part B – recital 9 a (new)
(9a) This Regulation does not prevent platforms from using an automated content filtering mechanism where necessary and justified, and in particular promotes the use of such a mechanism in cases where the illegal nature of the content has either been established by a court or can be easily determined without contextualisation.
Amendment 384 #
Motion for a resolution
Annex I – part B – recital 10
(10) This Regulation should also include provisions against unjustified content moderation practices.
Amendment 386 #
Motion for a resolution
Annex I – part B – recital 11
(11) The right to issue a notice pursuant to this Regulation should remain with any natural or legal person, including public bodies, to which content is provided through a website or application. A content hosting platform should, however, be able to block a user who repeatedly issues false notices from issuing further notices.
Amendment 388 #
Motion for a resolution
Annex I – part B – recital 12
Amendment 391 #
Motion for a resolution
Annex I – part B – recital 14
(14) Given the immediate nature of content hosting and the often ephemeral purpose of content uploading, it is necessary to establish independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse; however, such a process should not prevent the user's right of access to justice.
Amendment 398 #
Motion for a resolution
Annex I – part B – recital 17
(17) Circumstances on the basis of which jurisdiction should be established must be in the interests of the users, so that both the place where the content has been uploaded and downloaded shall be deemed to constitute a ground of jurisdiction.
Amendment 399 #
Motion for a resolution
Annex I – part B – recital 18
Amendment 401 #
Motion for a resolution
Annex I – part B – recital 20
(20) Since the objective of this Regulation, namely to establish a regulatory framework for contractual rights as regards content management in the Union, cannot be sufficiently achieved by the Member States but can rather, by reason of its scale and effects, be better achieved at Union level, the Union may adopt measures, in accordance with the principle of subsidiarity as set out in Article 5 of the Treaty on European Union. In accordance with the principle of proportionality, as set out in that Article, this Regulation does not go beyond what is necessary in order to achieve that objective.
Amendment 402 #
Motion for a resolution
Annex I – part B – recital 21
Amendment 409 #
Motion for a resolution
Annex I – part B – Article 1 – paragraph 1
The purpose of this Regulation is to contribute to the proper functioning of the internal market by laying down rules to provide digital service providers with a clear, uniform, and up-to-date innovation friendly regulatory framework in the Single Market, to protect, enable, and empower users when accessing digital services and to ensure the necessary cooperation among Member States in order to have oversight of digital service providers in the EU.
Amendment 411 #
Motion for a resolution
Annex I – part B – Article 2 – paragraph 1
This Regulation applies to providers offering digital services accessible on websites or through smart phone applications in the Union, irrespective of the place of establishment or registration, or principal place of business, in particular online platforms such as social media, search engines, online marketplaces or collaborative economy services.
Amendment 415 #
Motion for a resolution
Annex I – part B – Article 3 –point 1
(1) ‘content hosting platform’ means a provider of information society services within the meaning of point (b) of Article 1(1) of Directive (EU) 2015/1535 of the European Parliament and of the Council1, consisting of the storage of information provided by the recipient of the service at his or her request within the meaning of Article 14 of Directive 2000/31/EC, irrespective of its place of establishment, which directs its activities to users residing in the Union;
__________________
1 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1).
Amendment 417 #
Motion for a resolution
Annex I – part B – Article 3 –point 2
(2) ‘illegal content’ means any information which is not in compliance with Union law or the law of a Member State concerned;
Amendment 426 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 1
1. Content management shall be conducted in a fair, lawful and transparent manner. Content management practices shall be appropriate, relevant and proportionate to what is necessary in relation to the purposes for which the content is managed.
Amendment 428 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 2
Amendment 434 #
Motion for a resolution
Annex I – part B – Article 4 a (new)
Amendment 435 #
Motion for a resolution
Annex I – part B – Article 4 a (new)
Article 4a
Responsibility for goods
1. Any person procuring goods from a content hosting platform or through advertising on a platform shall have the right to pursue remedies against the platform if the person has pursued his or her remedies against the supplier but has failed to obtain the satisfaction to which he or she is entitled according to the law or the contract for the supply of goods.
2. The Commission should publish guidelines, in particular for small and medium-sized platforms, in order to support them in coping with their responsibility for goods and to ensure that administrative burdens are kept to a minimum.
3. A platform that has become liable according to this Article shall have the right to be indemnified by the supplier.
Amendment 437 #
Motion for a resolution
Annex I – part B – Article 4 b (new)
Article 4b
Transparency obligation
1. Digital service providers actively hosting or moderating online content shall take the necessary measures to disclose the funding and the power of interest groups behind those using their services, so that the person legally responsible and accountable can be identified.
2. Digital service providers without a permanent establishment in the EU shall designate a legal representative for the interests of users within the European Union and make the contact information of this representative visible and accessible on their websites.
Amendment 441 #
Motion for a resolution
Annex I – part B – Article 5 – subparagraph 2
Amendment 469 #
Motion for a resolution
Annex I – part B – Article 12 – title
Stay-down principle
Amendment 470 #
Motion for a resolution
Annex I – part B – Article 12 – paragraph 1
Amendment 474 #
Motion for a resolution
Annex I – part B – Article 12 – paragraph 1 a (new)
Digital service providers should act expeditiously to make unavailable or remove illegal content that has been notified to them and make best efforts to prevent future uploads of the same content.
Amendment 476 #
Motion for a resolution
Annex I – part B – Article 13 – paragraph 1
1. Member States may establish independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse when decisions on content moderation are appealed against.
Amendment 478 #
Motion for a resolution
Annex I – part B – Article 13 – paragraph 4
Amendment 483 #
Motion for a resolution
Annex I – part B – Article 14 – paragraph 3
Amendment 485 #
Motion for a resolution
Annex I – part B – Article 14 – paragraph 3 a (new)
3a. Both the place where the content has been uploaded and accessed shall be deemed to constitute a ground of jurisdiction.
Amendment 488 #
Motion for a resolution
Annex I – part B – Article 17
Article 17 deleted