133 Amendments of Jiří POSPÍŠIL related to 2020/2019(INL)
Amendment 4 #
Motion for a resolution
Citation 7 a (new)
- having regard to the Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 25 May 2016 on Online Platforms and the Digital Single Market - Opportunities and Challenges for Europe (COM(2016)288),
Amendment 5 #
Motion for a resolution
Citation 7 b (new)
- having regard to the Recommendation of the Commission of 1 March 2018 on measures to effectively tackle illegal content online (C(2018) 1177),
Amendment 6 #
Motion for a resolution
Citation 7 c (new)
- having regard to the Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market,
Amendment 7 #
Motion for a resolution
Citation 7 d (new)
- having regard to the Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services,
Amendment 8 #
Motion for a resolution
Citation 7 e (new)
- having regard to the Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography,
Amendment 9 #
Motion for a resolution
Citation 7 f (new)
- having regard to the Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism,
Amendment 12 #
Motion for a resolution
Recital A
A. whereas digital services, being a cornerstone of the Union’s economy and the livelihood of a large number of its citizens, need to be regulated in a way that balances central concerns like respect for fundamental rights and other rights of citizens, with the need to support development and economic progress, taking into account the interests of users and all market participants, with particular regard to small businesses, SMEs and start-ups;
Amendment 22 #
Motion for a resolution
Recital B a (new)
Ba. whereas digital services are used by the majority of Europeans on a daily basis, but are subject to an increasingly wide set of rules across the EU, leading to significant market fragmentation and consequently legal uncertainty for European users and services operating cross-border, combined with a lack of regulatory control over key aspects of today's information environment;
Amendment 23 #
Motion for a resolution
Recital C
C. whereas some businesses offering digital services could enjoy, due to strong data-driven network effects, market dominance that makes it increasingly difficult for other players to compete; this is currently under review; there is currently no clear basis in evidence that so-called network effects have led to a reduction in consumer choice or barriers to entry for new market entrants;
Amendment 29 #
Motion for a resolution
Recital D
Amendment 46 #
Motion for a resolution
Recital F
F. whereas content hosting platforms may determine what content is shown to their users, thereby profoundly influencing the way we obtain and communicate information, to the point that content hosting platforms have de facto become public spaces in the digital sphere; whereas public spaces must be managed in a manner that protects public health and user safety and respects fundamental rights and the civil law rights of the users;
Amendment 48 #
Motion for a resolution
Recital F
F. whereas content hosting platforms may determine what content is shown to their users, thereby profoundly influencing the way we obtain and communicate information, to the point that content hosting platforms have de facto become public spaces in the digital sphere; whereas public spaces must be managed in a manner that respects the rights of the users;
Amendment 49 #
Motion for a resolution
Recital G
G. whereas upholding the law in the digital world involves not only the effective enforcement of rights but also, in particular, ensuring access to justice for all; whereas delegating decisions on the legality of content, or law enforcement powers, to private companies can undermine the right to a fair trial and risks failing to provide an effective remedy; whereas decisions taken by digital service providers should be complemented by a fast-track legal procedure with adequate guarantees;
Amendment 54 #
Motion for a resolution
Recital H
Amendment 58 #
Motion for a resolution
Recital H
H. whereas, while removing unlawful content, content hosting platforms often cannot operate without mechanisms for the automatic detection and removal of content; whereas the removal of content by content hosting platforms may raise legitimate concerns in terms of respect for the rule of law and compliance with Article 10 of the European Convention on Human Rights, stating that formalities, conditions, restrictions or penalties governing the exercise of freedom of expression and information must be prescribed by law;
Amendment 60 #
Motion for a resolution
Recital H a (new)
Ha. whereas automated content removal mechanisms of digital service providers should be proportionate, covering only justified cases where the benefits of removing content outweigh the potential disadvantages of keeping it online; whereas these procedures should also be transparent and their terms and conditions should be made known to users before they use the service;
Amendment 61 #
Motion for a resolution
Recital H a (new)
Ha. whereas Article 11 of the Charter also protects the freedom and pluralism of the media, which are increasingly dependent on online platforms to reach their audiences; whereas online platforms should not interfere with media content;
Amendment 81 #
Motion for a resolution
Paragraph 1
1. Requests that the Commission submit without undue delay a set of legislative proposals comprising a Digital Services Act with a wide material, personal and territorial scope, including the recommendations as set out in the Annex to this resolution; considers that, without prejudice to detailed aspects of the future legislative proposals, Article 114 of the Treaty on the Functioning of the European Union should be chosen as the legal basis;
Amendment 86 #
Motion for a resolution
Paragraph 2
2. Proposes that the Digital Services Act provide digital service providers with a clear and up-to-date, innovation-friendly regulatory framework, protect users when accessing digital services, guarantee accessible and independent recourse to judicial redress and ensure the necessary cooperation among Member States;
Amendment 93 #
Motion for a resolution
Paragraph 2 a (new)
2a. Proposes that the Digital Services Act follow a sector and problem-specific approach and make a clear distinction between illegal and harmful content when elaborating the appropriate policy options;
Amendment 96 #
Motion for a resolution
Paragraph 2 b (new)
2b. Underlines that any new framework established in the Digital Services Act should be manageable for small businesses, SMEs and start-ups and should therefore include proportionate obligations and clear safeguards for all sectors;
Amendment 97 #
Motion for a resolution
Paragraph 2 c (new)
2c. Proposes that the Digital Services Act introduce enhanced transparency rules for social media platforms in order to disclose the funding and influence of the interest groups behind those using the digital services and to show who is legally responsible for the content;
Amendment 98 #
Motion for a resolution
Paragraph 2 d (new)
2d. Proposes that the Digital Services Act set the obligation for digital service providers without a permanent establishment in the EU to designate, in the interest of users within the European Union, a legal representative and to make the contact information of that representative visible and accessible on their websites;
Amendment 99 #
Motion for a resolution
Paragraph 2 e (new)
2e. Underlines that online platforms hosting or moderating content online should bear more responsibility for the content they host and should act proactively to prevent illegality;
Amendment 105 #
Motion for a resolution
Paragraph 3
3. Considers that, following the actions of digital service providers, any final decision on the legality of user-generated content must be made by an independent judiciary and not a private commercial entity;
Amendment 110 #
Motion for a resolution
Paragraph 4
4. Insists that the regulation must proscribe content moderation practices that are disproportionate or unduly go beyond the purpose of protection under the law;
Amendment 114 #
Motion for a resolution
Paragraph 4 a (new)
4a. Insists that the rules must also proscribe platforms’ practices that interfere with media freedom and pluralism, in particular by prohibiting platforms from exercising a second layer of control over content that is provided under a media service provider’s responsibility and is subject to specific standards and oversight;
Amendment 118 #
Motion for a resolution
Paragraph 5
5. Recommends the establishment of a network of national authorities tasked with monitoring the practice of automated content filtering and curation, and reporting to the EU institutions;
Amendment 122 #
Motion for a resolution
Paragraph 5
5. Recommends that procedures be established to enable intensive cooperation between the Member States' authorities, the European Commission, the private sector, academia and civil society on content moderation and continuous improvement;
Amendment 127 #
Motion for a resolution
Paragraph 6
6. Suggests that content hosting platforms regularly submit transparency reports to the European Agency concerning the compliance of their terms and conditions with the provisions of the Digital Services Act; further suggests that content hosting platforms make available, in an easy and accessible manner, their content policies and publish their decisions on removing user-generated content in a publicly accessible database;
Amendment 129 #
Motion for a resolution
Paragraph 6
6. Suggests that digital service providers regularly submit transparency reports to the network of national authorities and the European Commission, concerning the compliance of their terms and conditions with the provisions of the Digital Services Act; further suggests that content hosting platforms publish statistics and data related to automated content filtering and their decisions on removing user-generated content in a publicly accessible database;
Amendment 131 #
Motion for a resolution
Paragraph 6
6. Suggests that content hosting platforms regularly submit transparency reports to the European Agency, concerning the compliance of their terms and conditions with the provisions of the Digital Services Act; further suggests that content hosting platforms publish their decisions on removing user-generated content in a publicly accessible database and provide aggregated data on content removal for the production of studies to help set appropriate content management rules; further suggests that the Commission, based on available data, continuously evaluate the timeliness of content management rights and of the transparent, binding and uniform standards and content moderation procedures;
Amendment 138 #
Motion for a resolution
Paragraph 7
7. Considers the establishment of independent dispute settlement bodies in the Member States, tasked with settling disputes regarding content moderation;
Amendment 139 #
Motion for a resolution
Paragraph 8
8. Takes the firm position that the Digital Services Act must not contain provisions forcing content hosting platforms to employ any form of fully automated ex-ante controls of content, and considers that, where a platform uses a mechanism for fully automated ex-ante controls of content, it must be subject to checks on the legality of that mechanism;
Amendment 149 #
Motion for a resolution
Paragraph 8
8. Takes the firm position that the Digital Services Act must not contain provisions forcing digital service providers to employ automated filtering mechanisms that go beyond the level of protection required by the law, but encourages digital service providers to employ such mechanisms in order to combat illegal content online;
Amendment 151 #
Motion for a resolution
Paragraph 9
9. Considers that the user-targeted amplification of content based on the views or positions presented in such content is one of the most detrimental practices in the digital society, especially in cases where the visibility of such content is increased on the basis of previous user interaction with other amplified content and with the purpose of optimising user profiles for targeted advertisements; considers in this respect that new rules should, on top of bringing transparency and fairness, secure access to diverse and quality content in today's digital environment, and calls on the Commission to propose safeguards ensuring that quality media content is easy to access and easy to find on third party platforms.
Amendment 153 #
Motion for a resolution
Paragraph 9
9. Considers that the user-targeted amplification of content based on the views or positions presented in such content is a practice on which further monitoring might be required; considers therefore that the Commission should pay attention to and analyse the impact of cases where the visibility of such content is increased on the basis of previous user interaction with other amplified content and with the purpose of optimising user profiles for targeted advertisements;
Amendment 157 #
Motion for a resolution
Paragraph 10
Amendment 161 #
Motion for a resolution
Paragraph 10 a (new)
10a. Notes, however, that targeted advertising is currently governed by the General Data Protection Regulation, which has to be properly enforced in the Union before any new legislation in this field is considered;
Amendment 163 #
Motion for a resolution
Paragraph 11
11. Recommends, therefore, that the Digital Services Act introduce rules to enhance transparency related to targeted advertising, especially when data are tracked on third party websites;
Amendment 164 #
Motion for a resolution
Paragraph 11
11. Recommends, therefore, that the European legal framework set clear boundaries as regards the terms for accumulation of data for the purpose of targeted advertising, especially when data are tracked on third party websites;
Amendment 170 #
Motion for a resolution
Paragraph 12
Amendment 174 #
Motion for a resolution
Paragraph 12
12. Calls on the Commission to assess the possibility of defining fair contractual conditions to facilitate data sharing and increase transparency with the aim of addressing imbalances in market power; suggests, to this end, to explore options to facilitate the interoperability and portability of data;
Amendment 179 #
Motion for a resolution
Paragraph 13
13. Calls for content hosting platforms to give users the choice of whether to consent to the use of targeted advertising based on the user's prior interaction with content on the same content hosting platform or on third party websites; further calls on the platforms to create a publicly accessible advertising archive; further recommends that the platforms cooperate with fact-checkers in order to indicate the misinformation present on a platform and possible further steps;
Amendment 192 #
Motion for a resolution
Paragraph 15 a (new)
15a. Suggests creating a common understanding of what constitutes false or misleading advertising;
Amendment 196 #
Motion for a resolution
Paragraph 16
16. Calls for a streamlined exchange of necessary information between digital service providers and public authorities;
Amendment 207 #
Motion for a resolution
Paragraph 18
18. Considers that the necessary steps should be taken to ensure equality between the parties in the case of smart contracts, for which the Commission should examine the modalities;
Amendment 212 #
Motion for a resolution
Subheading 5
Amendment 213 #
Motion for a resolution
Paragraph 19
Amendment 216 #
Motion for a resolution
Paragraph 20
Amendment 219 #
Motion for a resolution
Paragraph 21
Amendment 234 #
Motion for a resolution
Annex I – part A – introductory part – indent 6
Amendment 235 #
Motion for a resolution
Annex I – part A – introductory part – indent 7
- The proposal raises the civil and commercial law aspects needed for assessment in the field of distributed ledger technologies, including block chains and, in particular, smart contracts.
Amendment 237 #
Motion for a resolution
Annex I – part A – introductory part – indent 8
- The proposal raises the importance of bringing clarity on the non-negotiable terms and conditions used by online platforms, ensuring the rights to access data and guaranteeing that access to justice is appropriately provided for.
Amendment 238 #
Motion for a resolution
Annex I – part A – part I – introductory part
The Digital Services Act should reflect, among others, the following elements of the proposals, on the basis of a proper public consultation and impact analysis:
Amendment 239 #
Motion for a resolution
Annex I – part A – part I – section 1 –introductory part
A regulation ‘on contractual rights as regards content management’ that contains the following elements:
Amendment 241 #
Motion for a resolution
Annex I – part A – part I – section 1 –– indent 1 a (new)
- It should build upon the home state control principle, by updating its scope in light of the increasing convergence of user protection.
Amendment 242 #
Motion for a resolution
Annex I – part A – part I – section 1 –– indent 1 b (new)
- It should make a clear distinction between illegal and harmful content when it comes to applying the appropriate policy options.
Amendment 243 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 1 c (new)
- It should avoid extending its scope that would conflict with existing sectorial rules already in force such as the Copyright Directive or other existing European law in the media and audio- visual field.
Amendment 244 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 2
- It should provide principles for content moderation, including as regards discriminatory content moderation practices.
Amendment 247 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 2 a (new)
- The involvement of the scientific community should be enhanced so that the interests of the European academic community are taken into account when drafting a legislative act.
Amendment 249 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 3
- It should provide formal and procedural standards for a deterrent and a notice-and-action mechanism.
Amendment 250 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 3
- It should provide formal and procedural standards for a notice and action system by following a sector-specific approach.
Amendment 252 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 3 a (new)
- It should assess the use of digital technology instruments for the deterrence of illegal content online.
Amendment 254 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 4
- It should provide rules for an independent dispute settlement mechanism by respecting the national competences of the Member States.
Amendment 260 #
Motion for a resolution
Annex I – part A – part I – section 2 – introductory part
An independent European content management expert committee, composed of experts from the Member States, independent academics and experts on online platforms, should be established and should:
Amendment 264 #
Motion for a resolution
Annex I – part A – part I – section 2 – introductory part
A network of national authorities should be established with the following main tasks:
Amendment 266 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 1
Amendment 267 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 1
- regular monitoring of the algorithms employed by content hosting platforms for the purpose of content moderation as well as curation;
Amendment 269 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 1 a (new)
- regular monitoring of the practice of automated content filtering and curation, and reporting to the EU institutions;
Amendment 271 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 2
Amendment 272 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 2
- regular review of the compliance of content hosting platforms with the Regulation and other provisions that form part of the Digital Services Act, in particular as regards the correct implementation of the standards for notice-and-action procedures and content moderation in their terms and conditions, on the basis of transparency reports provided by the content hosting platforms and the public database of decisions on removal of content to be established by the Digital Services Act;
Amendment 275 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 3 a (new)
- cooperate and coordinate with the national authorities of the Member States in relation to the implementation of the Digital Services Act.
Amendment 277 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4
Amendment 279 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – introductory part
- reporting to the Commission detected non-compliance with the rules established by the Digital Services Act, including publishing biannual reports on all of its activities.
Amendment 282 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 1
Amendment 283 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 1
Amendment 287 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 2
Amendment 289 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 2
Amendment 292 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 3
Amendment 294 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 3
Amendment 299 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 4
Amendment 301 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 4
Amendment 306 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 5
Amendment 307 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 5 a (new)
- publication of studies with proposals for measures aimed at helping increase the competitiveness and growth of micro, small and medium-sized enterprises.
Amendment 309 #
Motion for a resolution
Annex I – part A – part I – section 3 –– introductory part
The Digital Services Act should contain provisions requiring content hosting platforms to regularly provide transparency reports to the Commission and the network of national authorities. Such reports should, in particular, include:
Amendment 311 #
Motion for a resolution
Annex I – part A – part I – section 3 –– introductory part
The Digital Services Act should contain provisions requiring content hosting platforms to regularly provide transparency reports to the committee. Such reports should, in particular, include:
Amendment 320 #
Motion for a resolution
Annex I – part A – part II – section1 – introductory part
Measures regarding content curation, data and online advertisements, including political advertising to achieve politically motivated goals, in breach of fair contractual rights of users should include:
Amendment 321 #
Motion for a resolution
Annex I – part A – part II – section 1 – introductory part
Measures regarding content curation, data and online advertisements in breach of fair contractual rights of users should include:
Amendment 322 #
Motion for a resolution
Annex I – part A – part II – section 1 – indent 1
- Measures to limit the data collected by content hosting platforms, based on interactions of users with content hosted on content hosting platforms, for the purpose of completing targeted advertising profiles, in particular by imposing strict conditions for the use of targeted personal advertisements, while in the case of political advertising, the measures should be limited to the requirement of transparency in terms of clearly identifying political advertising, the possibility of identifying its sponsor and the entity in whose favour the advertisement was commissioned, and the obligation to indicate that it is political advertising should fall to its sponsor and be backed up by appropriate enforcement tools.
Amendment 326 #
Motion for a resolution
Annex I – part A – part II – section 1 – indent 1 a (new)
- In the context of political advertising, it would be appropriate to address the phenomenon of troll farms (also known as troll factories or web brigades) of anonymous commentators on political events who appear on social networks under a massive number of fake user profiles to manipulate public opinion, and to explore various options to combat this frequently cross-border interference in political competition, for instance promoting the concept of trusted personal profiles and possible synergies with efforts to build a European blockchain-based electronic identity verification service.
Amendment 335 #
Motion for a resolution
Annex I – part A – part II – section 2
Amendment 337 #
Motion for a resolution
Annex I – part A – part II – section 2 – indent 1
Amendment 347 #
Motion for a resolution
Annex I – part A – part II – section 3 – indent 1
- measures ensuring that the proper legislative framework is in place for the development and deployment of digital services including distributed ledger technologies, such as block chains, and in particular for smart contracts,
Amendment 348 #
Motion for a resolution
Annex I – part A – part II – section 3 – indent 2
Amendment 350 #
Motion for a resolution
Annex I – part A – part II – section 3 – indent 2 a (new)
- measures to ensure equality between the parties in the case of smart contracts, taking into account in particular the interests of small businesses and SMEs, for which the Commission should examine possible modalities.
Amendment 351 #
Motion for a resolution
Annex I – part A – part II – section 4
Amendment 352 #
Motion for a resolution
Annex I – part A – part II – section 4 – indent 1
Amendment 356 #
Motion for a resolution
Annex I – part A – part II – section 4 – indent 2
Amendment 358 #
Motion for a resolution
Annex I – part A – part II – section 4 – indent 3
Amendment 359 #
Motion for a resolution
Annex I – part A – part II – section 4– final part
Amendment 360 #
Motion for a resolution
Annex I – part B – recital 1
(1) The terms and conditions that digital service providers apply in relations with users are often non-negotiable and can be unilaterally amended by those providers. Action at a legislative level is needed to put in place minimum standards for such terms and conditions, in particular as regards procedural standards for content management;
Amendment 368 #
Motion for a resolution
Annex I – part B – recital 5
(5) Concerning relations with users, this Regulation should lay down minimum standards for the transparency and accountability of terms and conditions of content hosting platforms. Terms and conditions should include transparent, binding and uniform standards and procedures for content moderation, which should guarantee accessible and independent recourse to judicial redress.
Amendment 369 #
Motion for a resolution
Annex I – part B – recital 6
Amendment 377 #
Motion for a resolution
Annex I – part B – recital 9
Amendment 383 #
Motion for a resolution
Annex I – part B – recital 9 a (new)
(9a) This Regulation does not prevent platforms from using an automated content moderation mechanism where necessary and justified, and in particular promotes the use of such a mechanism where the illegal nature of the content has either been established by a court or can easily be determined without contextualisation.
Amendment 384 #
Motion for a resolution
Annex I – part B – recital 10
(10) This Regulation should also include provisions against unjustified content moderation practices.
Amendment 386 #
Motion for a resolution
Annex I – part B – recital 11
(11) The right to issue a notice pursuant to this Regulation should remain with any natural or legal person, including public bodies, to which content is provided through a website or application. A content hosting platform should, however, be able to block a user who repeatedly issues false notices from issuing further notices.
Amendment 388 #
Motion for a resolution
Annex I – part B – recital 12
Amendment 391 #
Motion for a resolution
Annex I – part B – recital 14
(14) Given the immediate nature of content hosting and the often ephemeral purpose of content uploading, it is necessary to establish independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse; however, such a process should not prevent the user's right of access to justice.
Amendment 398 #
Motion for a resolution
Annex I – part B – recital 17
(17) As regards jurisdiction, the circumstances on the basis of which jurisdiction is established must be in the interests of the users, so that both the place where the content has been uploaded and the place where it has been downloaded shall be deemed to constitute grounds of jurisdiction.
Amendment 399 #
Motion for a resolution
Annex I – part B – recital 18
Amendment 401 #
Motion for a resolution
Annex I – part B – recital 20
(20) Since the objective of this Regulation, namely to establish a regulatory framework for contractual rights as regards content management in the Union, cannot be sufficiently achieved by the Member States but can rather, by reason of its scale and effects, be better achieved at Union level, the Union may adopt measures, in accordance with the principle of subsidiarity as set out in Article 5 of the Treaty on European Union. In accordance with the principle of proportionality, as set out in that Article, this Regulation does not go beyond what is necessary in order to achieve that objective.
Amendment 402 #
Motion for a resolution
Annex I – part B – recital 21
Amendment 403 #
Motion for a resolution
Annex I – part B – recital 21
(21) Action at Union level as set out in this Regulation would be substantially enhanced with the creation of an independent European expert committee to propose measures to increase transparency, competitiveness and the growth of micro, small and medium-sized enterprises, in cooperation with Member States, the private sector, European academia and civil society. The committee should review compliance with the standards laid down for content management on the basis of transparency reports and an audit of the algorithms employed by content hosting platforms for the purpose of content management.
Amendment 407 #
Motion for a resolution
Annex I – part B – recital 21 a (new)
(21a) This Regulation must be based on a thorough impact study of the intended initiative, which will clearly demonstrate the need for new rules for information society service providers. This impact study will need to actively involve industry stakeholders providing information society services, who will be most affected by the potential initiative and whose practical day-to-day experience will be of the greatest value in the design and evaluation of the impact study itself, and thus in the assessment of the need for the aforementioned initiative.
Amendment 409 #
Motion for a resolution
Annex I – part B – Article 1 – paragraph 1
The purpose of this Regulation is to provide digital service providers with a clear, uniform and up-to-date innovation-friendly regulatory framework in the Single Market, to protect, enable and empower users when accessing digital services, and to ensure the necessary cooperation among Member States in order to have oversight of digital service providers in the EU.
Amendment 411 #
Motion for a resolution
Annex I – part B – Article 2 – paragraph 1
This Regulation applies to providers offering digital services accessible on websites or through smartphone applications in the Union, irrespective of their place of establishment or registration, or principal place of business, in particular online platforms such as social media, search engines, online marketplaces or collaborative economy services.
Amendment 415 #
Motion for a resolution
Annex I – part B – Article 3 –point 1
(1) ‘content hosting platform’ means a provider of an information society service within the meaning of point (b) of Article 1(1) of Directive (EU) 2015/1535 of the European Parliament and of the Council1, consisting of the storage of information provided by the recipient of the service at his or her request, within the meaning of Article 14 of Directive 2000/31/EC, irrespective of its place of establishment, which directs its activities to users residing in the Union; __________________ 1 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1).
Amendment 417 #
Motion for a resolution
Annex I – part B – Article 3 –point 2
(2) ‘illegal content’ means any information which is not in compliance with Union law or the law of the Member State concerned;
Amendment 426 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 1
1. Content management shall be conducted in a fair, lawful and transparent manner. Content management practices shall be appropriate, relevant and proportionate to what is necessary in relation to the purposes for which the content is managed.
Amendment 428 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 2
Amendment 434 #
Motion for a resolution
Annex I – part B – Article 4 a (new)
Amendment 437 #
Motion for a resolution
Annex I – part B – Article 4 b (new)
Article 4b Transparency obligation 1. Digital service providers actively hosting or moderating online content shall take the necessary measures to disclose the funding and the power of interest groups behind those using their services, so that the person legally responsible and accountable is identifiable. 2. Digital service providers without a permanent establishment in the EU shall designate a legal representative for users' interests within the European Union and make the contact information of that representative visible and accessible on their websites.
Amendment 441 #
Motion for a resolution
Annex I – part B – Article 5 – subparagraph 2
Amendment 469 #
Motion for a resolution
Annex I – part B – Article 12 – title
Stay-down principle
Amendment 470 #
Motion for a resolution
Annex I – part B – Article 12 – paragraph 1
Amendment 474 #
Motion for a resolution
Annex I – part B – Article 12 – paragraph 1 a (new)
Digital service providers should act expeditiously to make unavailable or remove illegal content that has been notified to them and make best efforts to prevent future uploads of the same content.
Amendment 476 #
Motion for a resolution
Annex I – part B – Article 13 – paragraph 1
1. Member States may establish independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse when decisions on content moderation are appealed against.
Amendment 478 #
Motion for a resolution
Annex I – part B – Article 13 – paragraph 4
Amendment 483 #
Motion for a resolution
Annex I – part B – Article 14 – paragraph 3
Amendment 485 #
Motion for a resolution
Annex I – part B – Article 14 – paragraph 3 a (new)
3a. Both the place where the content has been uploaded and the place where it has been accessed shall be deemed to constitute grounds of jurisdiction.
Amendment 488 #
Motion for a resolution
Annex I – part B – Article 17
Article 17 deleted