Activities of Emmanuel MAUREL related to 2020/2019(INL)
Shadow reports (1)
REPORT with recommendations to the Commission on a Digital Services Act: adapting commercial and civil law rules for commercial entities operating online
Amendments (36)
Amendment 15 #
Motion for a resolution
Recital A
A. whereas digital services, being a cornerstone of the Union’s economy and the livelihood of a large number of its citizens, need to be regulated in a way that guarantees fundamental rights and other rights; at the same time, such guarantees must allow for development and economic progress to be supported;
Amendment 37 #
Motion for a resolution
Recital E
E. whereas content hosting platforms evolved from involving the mere display of content into sophisticated bodies and market players, in particular in the case of social networks that optimise content which harvests and exploits usage data; whereas users have legitimate grounds and substantive rights to demand fair terms for the usage of such platforms;
Amendment 41 #
Motion for a resolution
Recital F
F. whereas content hosting platforms may determine what content is shown to their users, thereby profoundly influencing the way we obtain and communicate information, to the point that content hosting platforms have de facto become public spaces in the digital sphere; whereas public spaces must be managed in a manner that respects fundamental rights and the civil law rights of the users; whereas it is demonstrated that many illegal products and services are offered through online marketplaces, requiring action to be taken via the Digital Services Act; whereas the Digital Services Act must lay down clear obligations for online platforms and a special liability regime for online marketplaces;
Amendment 53 #
Motion for a resolution
Recital G
G. whereas upholding the law in the digital world does not only involve effective enforcement of rights, but also, in particular, ensuring access to justice for all; whereas delegation of the taking of decisions regarding the legality of content or of law enforcement powers to private companies can undermine the right to a fair trial and does not provide an effective remedy;
Amendment 84 #
Motion for a resolution
Paragraph 2
2. Proposes that the Digital Services Act include a regulation that establishes contractual rights as regards content management, lays down transparent, binding and uniform standards and procedures for content moderation, and guarantees rapid and simple recourse to judicial redress; proposes that digital service providers provide access to high-quality alternative dispute resolution meeting the criteria set out in Directive 2013/11/EU (Directive on alternative dispute resolution for consumer disputes);
Amendment 102 #
Motion for a resolution
Paragraph 3
3. Considers that any final decision on the legality of opinion content, including on social media, must be made by an independent judiciary and not a private commercial entity;
Amendment 107 #
Motion for a resolution
Paragraph 3 a (new)
3a. Considers that ‘online marketplace’ content hosting platforms should be considered active hosts and must be legally responsible for their decisions on the legality of user-generated content;
Amendment 108 #
Motion for a resolution
Paragraph 3 b (new)
3b. Considers that the notification and action system under the Electronic Commerce Directive – which obliges commercial platforms to remove identified illegal content after notification, including by right-holders, and an evaluation – must be strengthened into a notification and take-down system, so that illegal content which has already been removed can no longer reappear on the platform;
Amendment 126 #
Motion for a resolution
Paragraph 6
6. Suggests that content hosting platforms regularly submit transparency reports to the European Agency concerning the compliance of their terms and conditions with the provisions of the Digital Services Act; calls for the publication of comprehensive transparency reports, based on a consistent methodology and assessed on the basis of relevant performance indicators; further suggests that content hosting platforms publish their reasoned decisions on removing user-generated content in a publicly accessible database;
Amendment 137 #
Motion for a resolution
Paragraph 7
7. Recommends the appointment or establishment of national bodies tasked with settling disputes regarding content moderation not settled in advance through the internal procedures of the content hosting platforms; in addition, the various parties may have direct recourse to justice;
Amendment 143 #
Motion for a resolution
Paragraph 8
8. Takes the firm position that the Digital Services Act must contain provisions forcing ‘online marketplace’ content hosting platforms to employ reasonable and proportionate proactive measures to prevent illegal content being published on their platforms, and considers that the mechanisms employed by platforms must be subject to audits by the European Agency to ensure that there is compliance with the Digital Services Act and be subject to effective redress mechanisms;
Amendment 176 #
Motion for a resolution
Paragraph 12 a (new)
12a. Calls on the Commission to lay down rules to ensure effective data interoperability in order to make content purchased on a platform accessible on any digital device, irrespective of its make;
Amendment 178 #
Motion for a resolution
Paragraph 13
13. Calls for content hosting platforms to give users a real choice as to whether to consent to the use of targeted advertising based on the user’s prior interaction with content on the same content hosting platform or on third-party websites; that choice must be presented in an understandable way, and refusing it must not lead to access to the content being denied; considers that content aimed at children should be subject to stricter rules;
Amendment 195 #
Motion for a resolution
Paragraph 16
16. Regrets the existing information asymmetry between content hosting platforms and public authorities and calls for a compulsory and streamlined exchange of necessary information;
Amendment 209 #
Motion for a resolution
Paragraph 18 a (new)
18a. Calls on the Commission to require ‘online marketplace’ content hosting platforms to prohibit non-identifiable content publishers; such platforms must be able to identify the natural or legal persons who publish on their platform;
Amendment 210 #
Motion for a resolution
Paragraph 18 b (new)
18b. Calls on the Commission to require ‘online marketplace’ content hosting platforms to close the accounts of users who repeatedly publish illegal content and to take the necessary steps to ensure that such illegal content does not reappear on their platform;
Amendment 211 #
Motion for a resolution
Paragraph 18 c (new)
18c. Calls on the Commission to prohibit access to the EU market for ‘online marketplace’ content hosting platforms which:
- are unable to identify their users;
- do not take all necessary measures to take down illegal content;
- do not close the accounts of users who repeatedly publish illegal content;
Amendment 226 #
Motion for a resolution
Paragraph 22 a (new)
22a. Provisions on the safety of products sold online
Amendment 227 #
Motion for a resolution
Paragraph 22 b (new)
22b. Stresses that products bought through online marketplaces should comply with all the relevant EU safety regulations, given that the Digital Services Act should be able to upgrade the liability and safety rules for digital platforms, services and products;
Amendment 228 #
Motion for a resolution
Paragraph 22 c (new)
22c. Strongly believes that there is a need to strengthen platform liability for illegal and unsafe products, thus reinforcing the digital single market; recalls that in such cases platform liability should be fit for purpose, taking into account the consumer safeguards in place, which should be complied with at all times, and the establishment of concomitant redress measures for retailers and consumers; believes that the system can only function if enforcement authorities have sufficient powers, tools and resources to enforce the provisions and cooperate effectively in cases with a transnational element;
Amendment 229 #
Motion for a resolution
Paragraph 22 d (new)
22d. Stresses that in view of the commercial activities in online marketplaces, self-regulation has proven to be insufficient and calls, therefore, on the Commission to introduce strong safeguards and obligations with respect to product safety and consumer protection for commercial activities in online marketplaces, accompanied by a tailored liability regime with appropriate enforcement mechanisms;
Amendment 233 #
Motion for a resolution
Annex I – part A – introductory part – indent 5
- The proposal raises questions regarding aspects of data collection carried out in contravention of users’ fair contractual rights and of data protection and online confidentiality rules.
Amendment 251 #
Motion for a resolution
Annex I – part A – part I – section 1 – indent 3
- It should provide formal and procedural standards for a notice and removal system.
Amendment 253 #
Motion for a resolution
Annex I – part A – part I – section 1 – indent 4
- It should provide for an independent dispute settlement mechanism in accordance with the quality criteria laid down in Directive 2013/11/EU (Directive on alternative dispute resolution for consumer disputes), without limiting access to judicial remedy;
Amendment 256 #
Motion for a resolution
Annex I – part A – part I – section 1 – indent 5
- It should fully respect Union rules protecting personal data as well as fundamental rights and all applicable legislation.
Amendment 274 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 3
- working with content hosting platforms on best practices to meet the transparency and accountability requirements for terms and conditions, as well as best practices in content moderation and implementing notice-and-removal procedures;
Amendment 280 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – introductory part
- imposing fines for non-compliance with the Digital Services Act. Fines should be set at up to 4 % of the total worldwide annual turnover of the content hosting intermediary and take into account the platform’s overall compliance with the Digital Services Act. In the event of repeated infringement, the platform concerned may be banned from the European market. The fines should contribute to a special dedicated fund intended to finance the operating costs of the dispute settlement bodies described in the Regulation. Instances of non-compliance should include:
Amendment 284 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 1
- failure to implement the notice-and-removal system provided for in the Regulation;
Amendment 316 #
Motion for a resolution
Annex I – part A – part I – section 3 – indent 1 – subi. 8
- information on the enforcement of terms and conditions and information on court decisions ordering the annulment and/or modification of terms and conditions found to be unfair or illegal in an EU Member State.
Amendment 341 #
Motion for a resolution
Annex I – part A – part II – section 2 – indent 1 a (new)
- the ban on imposing a locked proprietary ecosystem for the use of digital products. In order to allow genuine data interoperability, digital products must be in an open format so as to allow users to export them to different digital environments.
Amendment 354 #
Motion for a resolution
Annex I – part A – part II – section 4 – indent 1
- include the effective enforcement of existing measures ensuring that non- negotiable terms and conditions do not include provisions regulating private international law matters to the detriment of access to justice,
Amendment 380 #
Motion for a resolution
Annex I – part B – recital 9
(9) This Regulation should not contain provisions forcing passive content hosting platforms to employ any form of fully automated ex-ante control of content.
Amendment 393 #
Motion for a resolution
Annex I – part B – recital 15
(15) Users and notifiers may, as a first step, make use of referral to independent dispute settlement bodies. It must be emphasised that such referral should not preclude any subsequent court action. Given that content hosting platforms which enjoy a dominant position on the market can particularly gain from the introduction of independent dispute settlement bodies, it is appropriate that they take responsibility for the financing of such bodies by means of a levy on their turnover, and that the functioning of such bodies is entirely independent of those platforms.
Amendment 419 #
Motion for a resolution
Annex I – part B – Article 3 – point 4
(4) ‘content moderation’ means the practice of monitoring and applying a pre-determined set of rules and guidelines to published content in order to ensure that the content complies with legal and regulatory requirements, community guidelines and terms and conditions, as well as any resulting measure taken by the platform, such as removal of the content or the deletion or suspension of the user’s account, be it through automated means or human operators;
Amendment 433 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 4 a (new)
4a. The accounts of users who repeatedly publish illegal content must be closed.
Amendment 456 #
Motion for a resolution
Annex I – part B – Article 9 – point 2
2. Following a notice, content hosting platforms shall decide to remove, take down or make invisible content that was the subject of a notice, if such content does not comply with legal and regulatory requirements, community guidelines or terms and conditions. Content hosting platforms must ensure that such content cannot reappear on their platform.