124 Amendments of Liesje SCHREINEMACHER related to 2020/2018(INL)
Amendment 2 #
Motion for a resolution
Citation 2 a (new)
- having regard to the communication from the Commission of 11 January 2012, entitled "A coherent framework for building trust in the Digital Single Market for e-commerce and online services" (COM(2011) 942 final),
Amendment 5 #
Motion for a resolution
Citation 2 b (new)
- having regard to the Memorandum of Understanding on the sale of counterfeit goods via the internet of 21 June 2016 and its review in the Communication from the Commission to the European Parliament, the Council and the European Economic and Social Committee of 29 November 2017, entitled "A balanced IP enforcement system responding to today's societal challenges" (COM(2017) 707 final),
Amendment 6 #
Motion for a resolution
Citation 3 a (new)
- having regard to the Communication from the Commission of 28 September 2017, entitled "Tackling Illegal Content Online: Towards an enhanced responsibility of online platforms" (COM(2017) 555 final), and to the Commission Recommendation of 1 March 2018 on measures to effectively tackle illegal content online (C(2018) 1177 final),
Amendment 7 #
Draft opinion
Paragraph 1
Amendment 14 #
Draft opinion
Paragraph 1 a (new)
1a. The upcoming legislative proposal on the Digital Services Act should fully respect the Charter of Fundamental Rights of the European Union, Union rules protecting consumers and their safety, privacy and personal data, and other fundamental rights;
Amendment 17 #
Draft opinion
Paragraph 1 b (new)
1b. Recalls the importance of the key principles of the E-Commerce Directive, namely the country of origin principle, the limited liability clause and the ban on a general monitoring obligation, remaining valid in the legislative proposal on the Digital Services Act;
Amendment 18 #
Draft opinion
Paragraph 1 c (new)
1c. Stresses the need to define 'dominant platforms' and lay down their characteristics;
Amendment 22 #
Draft opinion
Paragraph 2
2. Stresses the need for enforcement of the existing Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and notes that, since the online activities of individuals allow for deep insights into their personality and make it possible to manipulate them, the collection and use of personal data concerning the use of digital services should be subjected to a specific privacy framework and limited to this framework;
Amendment 23 #
Motion for a resolution
Recital B
B. whereas Directive 2000/31/EC of the European Parliament and of the Council2 ("the E-Commerce Directive") has been one of the most successful pieces of Union legislation and has shaped the Digital Single Market as we know it today; whereas, since its adoption 20 years ago, the European Court of Justice has issued a number of judgments in relation to it; whereas the clarifications made by the European Court of Justice should be codified; __________________ 2 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
Amendment 33 #
Motion for a resolution
Recital C
C. whereas, despite the clarifications made by the European Court of Justice, Member States currently have a fragmented approach to tackling illegal content online, as, since the entry into force of Directive 2000/31/EC, some Member States have adopted their own rules on 'notice-and-action' mechanisms; whereas there are therefore increasing differences between such national rules; whereas, as a consequence, the service providers concerned can be subject to a range of different legal requirements which diverge as to their content and scope;
Amendment 35 #
Motion for a resolution
Recital C a (new)
Ca. whereas a recent Parliament study1a shows that the potential gain of completing the Digital Single Market for services could be up to €100 billion; whereas the Digital Services Act should not only be a way to regulate those services but should also aim at unlocking this potential to the benefit of the European economy; __________________ 1a "Europe's two trillion euro dividend: Mapping the Cost of Non-Europe, 2019-2024", EPRS, PE 631.745, April 2019.
Amendment 35 #
Draft opinion
Paragraph 3
3. Notes that automated tools are unable to differentiate illegal content from content that is legal in a given context; highlights that human review of automated reports by service providers does not solve this problem, as private staff lack the independence, qualification and accountability of public authorities; stresses, therefore, that the Digital Services Act should explicitly regulate any obligation on hosting service providers or other technical intermediaries to use automated tools for content moderation, and refrain from imposing notice-and-stay-down mechanisms; insists that content moderation procedures used by providers should not lead to any ex-ante control measures based on automated tools or upload-filtering of content;
Amendment 38 #
Motion for a resolution
Recital C b (new)
Cb. whereas the E-Commerce Directive provides the foundations for the Digital Single Market by setting out the country of origin principle, forbidding any form of prior authorisation, and establishing a limited liability regime and a ban on a general monitoring obligation; whereas great care must be taken not to alter these principles if the Commission decides to propose to amend, widen or limit this Directive;
Amendment 39 #
Motion for a resolution
Recital D
D. whereas the social and economic challenges brought by the COVID-19 pandemic are showing the resilience of the e-commerce sector and its potential as a driver for relaunching the European economy; whereas the Commission rapidly contacted a number of platforms, social media, search engines and market places to require their cooperation in taking down scams from their platforms; whereas platforms replied positively to this call for cooperation and a rapid and efficient information exchange has since been in place; whereas, at the same time, the pandemic has also shown that platforms and online intermediation services need to step up their efforts to rapidly detect and take down fake claims and to tackle the misleading practices of rogue traders, in particular those selling false medical equipment online, in a consistent and coordinated manner; whereas this calls for action at Union level to take a more coherent and coordinated approach to combating these misleading practices;
Amendment 45 #
Draft opinion
Paragraph 4
4. Stresses that the responsibility for enforcing the law, deciding on the legality of online activities and ordering hosting service providers to remove or disable access to illegal content as soon as possible, and after the provider and the parties involved have been informed, should rest with independent dispute settlement bodies, with possible appeal to judicial authorities; considers that only a hosting service provider that has actual knowledge of the existence of illegal content and of its illegal nature should be subject to content removal obligations;
Amendment 46 #
Motion for a resolution
Recital D a (new)
Da. whereas scandals have recently emerged regarding data harvesting and selling, such as the Cambridge Analytica case, fake news, political advertising and manipulation, and a host of other online harms (from hate speech to the broadcast of terrorism);
Amendment 48 #
Motion for a resolution
Recital D b (new)
Db. whereas Directive (EU) 2019/770, Directive (EU) 2019/771, and Directive (EU) 2019/2161 were all adopted less than a year ago and are still in the process of being implemented and transposed into national legislation;
Amendment 49 #
Motion for a resolution
Recital D c (new)
Dc. whereas Regulation (EU) 2019/1150 on promoting fairness and transparency for business users of online intermediation services only came into force in July 2019 and is only binding on platforms from 12 July 2020;
Amendment 50 #
Motion for a resolution
Recital D d (new)
Dd. whereas the COVID-19 pandemic has shown how vulnerable EU consumers are to misleading trading practices by dishonest traders selling illegal products online that are not compliant with Union safety rules or imposing unjustified and abusive price increases or other unfair conditions on consumers; whereas this problem is aggravated by the fact that often the identity of these companies cannot be established;
Amendment 55 #
Draft opinion
Paragraph 5
Amendment 62 #
Motion for a resolution
Paragraph 1
1. Welcomes the Commission’s commitment to submit a proposal for a Digital Services Act package, and, on the basis of Article 225 of the Treaty on the Functioning of the European Union (TFEU), calls on the Commission to submit such a package on the basis of the relevant Articles of the Treaties, following the recommendations set out in the Annex hereto;
Amendment 62 #
Draft opinion
Paragraph 5 a (new)
5a. Stresses the need to enforce existing measures to limit the data collected by content hosting platforms, based inter alia on interactions of users with content hosted on content hosting platforms, for the purpose of completing targeted advertising profiles, in particular by imposing strict conditions for the use of targeted personal advertisements;
Amendment 63 #
Draft opinion
Paragraph 5 b (new)
5b. Insists that the regulation must prohibit content moderation practices that are discriminatory;
Amendment 64 #
Draft opinion
Paragraph 5 c (new)
Amendment 81 #
Motion for a resolution
Paragraph 2 a (new)
2a. Stresses that the Commission should, ahead of a possible revision of the E-Commerce Directive, complete a full public consultation, including an in-person stakeholder hearing, and a full impact assessment, and take into account the lessons learned from the COVID-19 crisis and from European Parliament resolutions; similarly, stresses that this must also apply to other potential pieces of the Digital Services Act package;
Amendment 84 #
Motion for a resolution
Paragraph 2 b (new)
2b. Underlines that, if a revision is approved by the co-legislators, the implementation of the finally adopted legislation should be supported by the adoption of vademecums and implementation guidelines;
Amendment 120 #
Motion for a resolution
Paragraph 5
5. Takes the view that a level playing field in the internal market between the platform economy and the 'traditional' offline economy, based on the same rights and obligations for all interested parties - consumers and businesses - is needed; considers that the Digital Services Act should not tackle the issue of platform workers; notes that a report on "Fair working conditions, rights and social protection for platform workers - New forms of employment linked to digital development" is being prepared by the relevant committee of the European Parliament;
Amendment 127 #
Motion for a resolution
Paragraph 5 a (new)
5a. Calls on the Commission to focus its work and to ensure that any legislation is targeted and limited; encourages the Commission to refrain from any attempt to cover all long-standing Digital Single Market issues within a single package; underlines that the previous Commission already had an extensive digital agenda and that there is a need to assess its effects before regulating again on the same issues; underlines in particular that Directive (EU) 2019/770 and Directive (EU) 2019/771 are still to be properly transposed and implemented; asks the Commission to take this into account before taking additional measures;
Amendment 131 #
Motion for a resolution
Paragraph 5 b (new)
Amendment 147 #
Motion for a resolution
Paragraph 6 c (new)
6c. Stresses that any future legislative proposals should seek to remove current barriers, and prevent potential new barriers, in the supply of digital services by online platforms; underlines, at the same time, that new Union obligations on platforms must be proportionate and clear in nature in order to avoid unnecessary regulatory burdens or unnecessary restrictions; underlines the need to prevent gold-plating of Union legislation by Member States;
Amendment 148 #
Motion for a resolution
Paragraph 6 d (new)
6d. Recalls that the E-Commerce Directive was drafted in a technologically neutral manner in order to avoid amendments to the legal framework being necessitated by the fast pace of innovation in the IT sector; asks the Commission to ensure that any revisions remain technologically neutral;
Amendment 152 #
Motion for a resolution
Paragraph 7 a (new)
7a. Believes that the principles that guided the legislators when regulating information society services providers in the late 1990s are still valid and should be used when drafting any future proposals, namely:
(a) to provide appropriate information on a wide scale;
(b) to prevent the creation of fresh obstacles and the re-fragmentation of the internal market;
(c) to reduce disputes to a minimum;
(d) to avoid the risks of over-regulation;
(e) to protect general interests more effectively and to identify any need for rules quickly;
(f) to step up administrative cooperation;
(g) to strengthen Union participation in international discussions;
Amendment 175 #
Motion for a resolution
Paragraph 9
Amendment 190 #
Motion for a resolution
Paragraph 10
10. Stresses that the Digital Services Act should strengthen the internal market for services while protecting the rights set out in the Charter of Fundamental Rights of the European Union, in particular freedom of expression;
Amendment 195 #
Motion for a resolution
Paragraph 11
Amendment 203 #
Motion for a resolution
Paragraph 12
Amendment 216 #
Motion for a resolution
Paragraph 13
13. Considers that the current transparency and information requirements set out in the E-Commerce Directive on information society services providers and their business customers that provide services to consumers (B2B2C), and the minimum information requirements on commercial communications, should be substantially strengthened;
Amendment 231 #
Motion for a resolution
Paragraph 14
14. Calls on the Commission to require intermediate service providers to collect the information and identity of the business partners with whom they have a contractual commercial relationship when those business partners have a direct relationship with consumers through the intermediate service, and to ensure that the information is updated where competent authorities have informed the providers of any inaccuracy;
Amendment 239 #
Motion for a resolution
Paragraph 15
15. Calls on the Commission, if proposing measures on internet service providers aimed at increasing transparency and information, to take into account the difference between the underlying hosting internet service provider on the one hand and a platform or other website and its users on the other; stresses that internet service providers often have no contractual relations with a platform's business users or consumers, including having no legal right to view or access stored data; asks the Commission to ensure that enforcement measures are targeted in a way that takes this difference into account and does not force the breach of privacy and legal process; considers that these obligations should be proportionate and enforced by appropriate, effective and dissuasive penalties;
Amendment 250 #
Motion for a resolution
Paragraph 15 a (new)
15a. Underlines the need for due process; stresses the need to prevent the abuse of transparency, redress and other systems by businesses in order to confront other businesses; believes that any revisions must seek to balance the rights of all users and ensure that the law is not drafted to favour one legitimate interest over another;
Amendment 272 #
Motion for a resolution
Paragraph 17
17. Believes that while AI-driven services, currently governed by the E-Commerce Directive, have enormous potential to deliver benefits to consumers and service providers, the new Digital Services Act should also address the concrete challenges they present that are not already covered by current legislation, in terms of ensuring non-discrimination, transparency on the data sets and the explainability - to the extent possible - of algorithms, as well as liability; points out the need to monitor algorithms and to assess associated risks, to use high-quality and unbiased datasets, and to help individuals acquire access to diverse content, opinions, and high-quality products and services;
Amendment 281 #
Motion for a resolution
Paragraph 17 a (new)
17a. Recalls that the protection of personal data subject to automated decision-making processes is already covered, among others, by the General Data Protection Regulation and none of the proposals should seek to repeat or amend such measures;
Amendment 282 #
Motion for a resolution
Paragraph 17 b (new)
17b. Underlines that algorithms can be protected as trade secrets within the meaning of Directive (EU) 2016/943; stresses that any supervision of such algorithms, where needed, must be carried out by the national regulatory authority of the country of origin, on a case-by-case basis, only when a Member State has reason to believe that an algorithm is biased, and must be subject to clear confidentiality rules;
Amendment 283 #
Motion for a resolution
Paragraph 17 c (new)
17c. Believes that the focus of the Commission should be on potential bias within datasets or in the output, rather than on the algorithms themselves;
Amendment 288 #
Motion for a resolution
Paragraph 18
18. Considers that users have the right to be properly informed and to have their rights effectively guaranteed when they interact with automated decision-making systems and other innovative digital services or applications; considers, further, that users should be informed when a service is personalised and whether the personalisation can be switched off or otherwise limited; believes that it should be possible for users to request checks and corrections of possible mistakes resulting from automated decisions, as well as to seek redress for any damage related to the use of automated decision-making systems;
Amendment 297 #
Motion for a resolution
Paragraph 18 a (new)
18a. Stresses that digital services should not exclusively use automated decision-making systems for consumer support;
Amendment 303 #
Motion for a resolution
Paragraph 19
19. Stresses that the existence and spread of illegal content online, such as incitement to terrorism, illegal hate speech or child sexual abuse material, as well as infringements of intellectual property rights and consumer protection online, undermines citizens' trust and confidence in the digital environment, harms healthy platform ecosystems in the Digital Single Market and severely hampers the development of legitimate markets for digital services;
Amendment 310 #
Motion for a resolution
Paragraph 19 a (new)
19a. Believes that allowing new innovative business models to flourish and strengthening the Digital Single Market by removing barriers to the free movement of digital content - barriers which create fragmented national markets and a demand for illegal content - has been proven to work in the past, especially in relation to infringements of intellectual property rights;
Amendment 319 #
Motion for a resolution
Paragraph 20
20. Stresses the need to distinguish between 'illegal' content, 'harmful' content and other content; notes that some content linked to religious beliefs or political positions, for instance, might be considered harmful without being illegal; considers that 'harmful' legal content should not be regulated or defined in the Digital Services Act, as it is protected by the freedom of expression;
Amendment 328 #
Motion for a resolution
Paragraph 20 a (new)
20a. Stresses also that content that might be seen as 'illegal' in some Member States may not be seen as such in others, as only some types of 'illegal' content are harmonised at EU level; notes that there is therefore no 'one size fits all' solution to all types of 'illegal' content;
Amendment 329 #
Motion for a resolution
Paragraph 20 b (new)
20b. Believes, however, that a more aligned approach at Union level, taking into account the different types of content and online platforms and based on cooperation and exchange of best practices, will make the fight against 'illegal' content more effective;
Amendment 330 #
Motion for a resolution
Paragraph 20 c (new)
20c. Underlines the need to adapt the severity of the measures to be taken by service providers to the seriousness of the infringement, so that the fight against terrorism, illegal hate speech or child sexual abuse material takes clear precedence over other types of infringements;
Amendment 340 #
Motion for a resolution
Paragraph 21
21. Considers that voluntary actions and self-regulation by online platforms across Europe have brought some benefits, and that additional measures should be taken to ensure the swift detection and removal of illegal content online;
Amendment 342 #
Motion for a resolution
Paragraph 21 a (new)
21a. Would welcome the adoption of measures allowing online intermediaries to carry out further self-controls of content on their sites without fear of increased liability under the E-Commerce Directive; disagrees, at the same time, with any measures which would require such self-controls in order to qualify for limited liability protections;
Amendment 345 #
Motion for a resolution
Paragraph 21 b (new)
21b. Underlines, however, the need to prevent general monitoring of content uploads and the need for a light-handed approach by online intermediaries towards user-uploaded content of a non-commercial nature; underlines that algorithms are not able to fully understand context and the legal uses of content as set out in Union and national legislation; believes that filters based on algorithms alone systematically lead to the removal of legitimate content ('false positives') and to the corruption of such systems to the benefit of unfair commercial practices; asks that, where there is doubt as to whether content is 'illegal', that content not be removed before further investigation;
Amendment 347 #
Motion for a resolution
Paragraph 21 c (new)
21c. Asks the Commission to issue a study on the removal of content and data during the COVID-19 crisis by automated decision-making and the level of removals in error (false positives) that were included in the number of items removed;
Amendment 364 #
Motion for a resolution
Paragraph 22 a (new)
22a. Stresses that such a 'notice-and-action' mechanism must be human-centric and give the benefit of the doubt to users; underlines that safeguards against the abuse of the system should be introduced, including against repeated false flagging, unfair commercial practices and other schemes; underlines that, for many small traders, the removal of even a single product can result in the collapse of a business;
Amendment 368 #
Motion for a resolution
Paragraph 22 b (new)
22b. Notes the challenges around the enforcement of legal injunctions issued within Member States other than the country of origin of a service provider; stresses the need to investigate this issue outside the scope of the Digital Services Act and of any 'notice-and-action' mechanism;
Amendment 371 #
Motion for a resolution
Paragraph 23
23. Stresses that the safeguards of the legal liability regime for hosting intermediaries with regard to user-uploaded content, and the general monitoring prohibition set out in Article 15 of the E-Commerce Directive, are still relevant and need to be preserved; underlines, in this context, that the legal liability regime and the ban on general monitoring should not be weakened via possible new legislation or the amendment of other sections of the E-Commerce Directive, including the amendment of the definitions laid down in the Directive;
Amendment 386 #
Motion for a resolution
Paragraph 23 a (new)
23a. Asks the Commission to review the Annex to the E-Commerce Directive and, where relevant, to remove or further limit the derogations granted therein; notes that a significant and ever-increasing part of the Digital Single Market is made up of the services included therein;
Amendment 390 #
Motion for a resolution
Paragraph 23 b (new)
23b. Notes that online intermediaries might encrypt or otherwise prevent outside access to their content by third parties, including hosting intermediaries, who do not have the encryption key; believes therefore that any requirements should take this and similar practical problems into account;
Amendment 397 #
Motion for a resolution
Paragraph 24
24. Notes that while online platforms, such as online market places, have benefited both retailers and consumers by improving choice and lowering prices, at the same time an increasing number of non-compliant sellers - especially from third countries - are offering unsafe or illegal products on the European market;
Amendment 421 #
Motion for a resolution
Paragraph 26 a (new)
26a. Asks the Commission to act at global level for minimum requirements for business information disclosure when trading online with consumers, for the promotion of good practice via the development of new guidelines and the use of existing standards, and for the creation of a network of consumer centres to help European consumers handle disputes with traders based in non-EU countries;
Amendment 426 #
Motion for a resolution
Paragraph 26 b (new)
26b. Notes the continued issues of the abuse or wrong application of selective distribution agreements to limit the availability of products and services across borders within the Single Market and between platforms; asks the Commission to act on this issue within any wider review of Vertical Block Exemptions and other policies under Article 101 TFEU, while refraining from including it in the Digital Services Act;
Amendment 428 #
Motion for a resolution
Paragraph 26 c (new)
26c. Treatment of contracts (new subheading)
Amendment 429 #
Motion for a resolution
Paragraph 26 d (new)
26d. Asks the Commission to review all notifications under Article 9(3) of the E-Commerce Directive and, where the Commission believes they are no longer merited, to require Member States to remove such requirements; asks, moreover, that this review take place every two years instead of every five;
Amendment 430 #
Motion for a resolution
Paragraph 26 e (new)
26e. Notes the rise of 'smart contracts' based on distributed ledger technologies; asks the Commission to analyse whether certain aspects of 'smart contracts' should be clarified and whether guidance should be given in order to ensure legal certainty for businesses and consumers; asks, in particular, that the Commission work to ensure that such contracts with consumers are valid and binding throughout the Union, that they meet the standards of consumer law, for example the right of withdrawal under Directive 2011/83/EU, and that they are not subject to national barriers to application, such as notarisation requirements;
Amendment 431 #
Motion for a resolution
Paragraph 26 f (new)
26f. Asks the Commission, while recalling earlier efforts, to further review the practice of End User Licensing Agreements (EULAs) and Terms and Conditions agreements (T&Cs) and to seek ways to allow greater and easier engagement for consumers, including in the choice of clauses; notes that EULAs and T&Cs are often accepted by users without being read; notes, moreover, that when EULAs and T&Cs do allow users to opt out of clauses, service providers may require users to do so at each use, often in bad faith, to encourage acceptance;
Amendment 460 #
Motion for a resolution
Paragraph 28 a (new)
28a. Underlines that additional ex-ante regulation on small and medium-sized enterprises should be avoided wherever possible and that additional requirements on systemic platforms should not lead to additional requirements for those businesses that use them;
Amendment 478 #
Motion for a resolution
Paragraph 30
30. Asks the Member States to strengthen national regulatory authorities with the financial means and staff to allow for full oversight of the online intermediaries established within their territories; believes that the Commission, through the Joint Research Centre, should be empowered to provide expert assistance to the Member States, upon request, for the analysis of technological, administrative or other matters in relation to the enforcement of Digital Single Market legislation; encourages the Member States to pool and share best practices between national regulators, and to grant regulators legal authority to communicate between themselves in a secure manner;
Amendment 497 #
Motion for a resolution
Paragraph 32
Paragraph 32
32. Calls on the Commission to strengthen and modernise the current provisions on out-of-court settlement and court actions to allow for consumer redress; underlines that such measures should seek to support consumers that do not have the financial or legal means to use the court system and should not weaken the legal protections of small businesses and traders that national legal systems provide;
Amendment 502 #
Motion for a resolution
Paragraph 32 a (new)
Paragraph 32 a (new)
32a. Calls on national regulators and the Commission to provide further advice and assistance to EU SMEs about their rights;
Amendment 512 #
Motion for a resolution
Annex I – part I – paragraph 2
Annex I – part I – paragraph 2
The Digital Services Act should guarantee that online and offline economic activities are treated equally and on a level playing field which fully reflects the principle that “what is illegal offline is also illegal online” and equally “what is legal offline is also legal online”;
Amendment 522 #
Motion for a resolution
Annex I – part I – paragraph 5
Annex I – part I – paragraph 5
The Digital Services Act should build upon the rules currently applicable to online platforms, namely the E-Commerce Directive and the Platform to Business Regulation1 while refraining from proposing measures that were rejected by the co-legislators during its negotiation. __________________ 1 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57).
Amendment 529 #
Motion for a resolution
Annex I – part I – paragraph 6 – indent 1 – subi. 1
Annex I – part I – paragraph 6 – indent 1 – subi. 1
- a revised framework with clear transparency and information obligations;
Amendment 556 #
Motion for a resolution
Annex I – part II – paragraph 2
Annex I – part II – paragraph 2
The territorial scope of the future Digital Services Act should be extended to cover also the activities of companies and service providers established in third countries, when they target or direct services or goods to consumers or users in the Union;
Amendment 558 #
Motion for a resolution
Annex I – part II – paragraph 3
Annex I – part II – paragraph 3
The Digital Services Act should review the derogations set out in the Annex of the E-Commerce Directive and, if deemed necessary, revise them, while maintaining the derogation of contractual obligations concerning consumer contracts;
Amendment 561 #
Motion for a resolution
Annex I – part II – paragraph 4
Annex I – part II – paragraph 4
The Digital Services Act should seek to further harmonise consumer protection across the Union, in alignment with Directive (EU) 2019/770 and Directive (EU) 2019/771, and to maintain a higher level of consumer protection and pursue legitimate public interest objectives in accordance with EU law;
Amendment 574 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 1
Annex I – part III – paragraph 1 – indent 1
- clarify to what extent “new digital services”, such as social media networks, collaborative economy services, search engines, wifi hotspots, online advertising, cloud services, content delivery networks, and domain name services fall within the scope of the Digital Services Act;
Amendment 577 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 2
Annex I – part III – paragraph 1 – indent 2
- clarify the nature of the content hosting intermediaries (text, images, video, or audio content) on the one hand, and commercial online marketplaces (selling physical and digital goods) on the other;
Amendment 588 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 4 a (new)
Annex I – part III – paragraph 1 – indent 4 a (new)
- refrain from seeking to define or act upon “harmful content”;
Amendment 598 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 5 a (new)
Annex I – part III – paragraph 1 – indent 5 a (new)
- seek to codify the decisions of the European Court of Justice, where needed, while having due regard to the different pieces of legislation which use these definitions;
Amendment 599 #
Motion for a resolution
Annex I – part IV – title
Annex I – part IV – title
IV. TRANSPARENCY AND INFORMATION OBLIGATIONS
Amendment 604 #
Motion for a resolution
Annex I – part IV – paragraph 1 – introductory part
Annex I – part IV – paragraph 1 – introductory part
The Digital Services Act should introduce clear transparency and information obligations; those obligations should not create any derogations or new exemptions to the current liability regime and the secondary liability set out under Articles 12, 13, and 14 of the E-Commerce Directive and should cover the aspects described below:
Amendment 616 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 1 – indent 2
Annex I – part IV – paragraph 1 – subparagraph 1 – indent 2
- that measure should apply only to business-to-business relationships and should be without prejudice to the rights of users under the GDPR, as well as the right to internet anonymity or being an unidentified user; the new general information requirements should review and further enhance Articles 5, 6 and 10 of the E-Commerce Directive in order to align those measures with the information requirements established in recently adopted legislation, in particular the Unfair Contract Terms Directive5 , the Consumer Rights Directive and the Platform to Business Regulation. __________________ 5 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts, most recently amended by Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules (OJ L 328, 18.12.2019, p. 7).
Amendment 625 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 1
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 1
- to expressly set out in their contract terms and general conditions that service providers will not knowingly store illegal content;
Amendment 633 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 4
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 4
- to ensure that the contract terms and general conditions comply with Union law, including any and all relevant information requirements, such as those of the Unfair Contract Terms Directive, the Consumer Rights Directive and the GDPR;
Amendment 637 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 5
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 5
- to specify clearly and unambiguously in their contract terms and general conditions the main parameters determining the ranking of content, and the reasons for the importance of those parameters as opposed to other parameters.
Amendment 639 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 5 a (new)
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 5 a (new)
- start all Terms and Conditions agreements and all End-User Licensing Agreements with a summary statement based on a framework and document template, to be created by the Commission.
Amendment 661 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 4 – indent 1
Annex I – part IV – paragraph 1 – subheading 4 – indent 1
- establish comprehensive rules on non-discrimination, transparency on the data set, and oversight and risk assessment of algorithms for AI-driven services by national regulatory authorities in order to ensure a higher level of consumer protection where there are gaps in current legislation;
Amendment 672 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 4 – indent 3 a (new)
Annex I – part IV – paragraph 1 – subheading 4 – indent 3 a (new)
- be on a case-by-case basis and not require a blanket investigation of all AI systems;
Amendment 674 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 4 – indent 3 b (new)
Annex I – part IV – paragraph 1 – subheading 4 – indent 3 b (new)
- allow authorities to check algorithms when they have justified reasons to believe that an algorithm exhibits bias;
Amendment 676 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 4 – indent 3 c (new)
Annex I – part IV – paragraph 1 – subheading 4 – indent 3 c (new)
- be subject to clear rules on confidentiality and the protection of trade secrets;
Amendment 678 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 4 – indent 3 d (new)
Annex I – part IV – paragraph 1 – subheading 4 – indent 3 d (new)
- ensure that consumers are protected by the right to be informed and the right to an explanation of AI services, in addition to the right to switch off or limit an AI system using personalisation where possible;
Amendment 687 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 4
Annex I – part IV – paragraph 1 – subparagraph 4
Compliance with the due diligence provisions should be reinforced with effective, proportionate and dissuasive penalties, including the imposition of fines.
Amendment 710 #
Motion for a resolution
Annex I – part V – paragraph 1 – indent 4
Annex I – part V – paragraph 1 – indent 4
- introduce new transparency and national oversight of the content moderation procedures and tools related to the removal of illegal content online; such systems and procedures should be available for auditing and testing by national authorities of the country of origin;
Amendment 715 #
Motion for a resolution
Annex I – part V – paragraph 1 – indent 4 a (new)
Annex I – part V – paragraph 1 – indent 4 a (new)
- adapt the severity of the measures that need to be taken by service providers to the seriousness of the infringement;
Amendment 717 #
Motion for a resolution
Annex I – part V – paragraph 1 – indent 4 b (new)
Annex I – part V – paragraph 1 – indent 4 b (new)
- ensure that blocking access to and the removal of illegal content does not require closing off access to entire sites and services which are otherwise legal, and affects only the exact content that was noticed.
Amendment 725 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 2
Annex I – part V – paragraph 2 – indent 2
- rank different types of providers, sectors and/or illegal content in order to appreciate the seriousness of the infringement;
Amendment 736 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 7
Annex I – part V – paragraph 2 – indent 7
- require notices to be sufficiently precise and adequately substantiated so as to allow the service provider receiving them to take an informed and diligent decision as regards the effect to be given to the notice, and specify the requirements necessary to ensure that notices are of a good quality, thereby enabling a swift removal of illegal content; such requirements should include the name and contact details of the notice provider, the link (URL) to the allegedly illegal content in question, the stated reason for the claim including an explanation of the reasons why the notice provider considers the content to be illegal, and if necessary, depending on the type of content, additional evidence for the claim;
Amendment 742 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 8
Annex I – part V – paragraph 2 – indent 8
- allow for the notice provider to provide their contact details, without this being required, but while recording the IP address or another equivalent identifier of the provider in order to prevent abuse;
Amendment 747 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 9
Annex I – part V – paragraph 2 – indent 9
- consider requiring, when a complaint is not anonymous, a declaration of good faith that the information provided is accurate;
Amendment 758 #
Motion for a resolution
Annex I – part V – paragraph 2 a (new)
Annex I – part V – paragraph 2 a (new)
The Digital Services Act notice-and-action mechanism should be based on the work carried out by the Commission in 2012 and 2013, including the public consultations on a potential self-standing Directive on procedures for notifying and acting on illegal content hosted by online intermediaries.
Amendment 759 #
Motion for a resolution
Annex I – part V – paragraph 2 b (new)
Annex I – part V – paragraph 2 b (new)
The Digital Services Act notice-and-action mechanism should be binding only for illegal content. This, however, should not prevent online intermediaries from adopting a similar notice-and-action mechanism for other content.
Amendment 760 #
Motion for a resolution
Annex I – part V – paragraph 2 c (new)
Annex I – part V – paragraph 2 c (new)
The right to be notified before a decision is taken to remove content and the right of a user to issue a counter-notice shall only be restricted or waived where: (a) online intermediation services are subject to a legal or regulatory obligation which requires them to terminate the provision of the whole of their online intermediation services to a given user in a manner which does not allow them to respect the notice-and-action mechanism; or (b) online intermediation services can demonstrate that the user concerned has repeatedly infringed the applicable terms and conditions, including by uploading multiple items of potentially illegal content.
Amendment 771 #
Motion for a resolution
Annex I – part V – subheading 2 – indent 4 a (new)
Annex I – part V – subheading 2 – indent 4 a (new)
- an out-of-court dispute settlement mechanism should meet certain standards, notably in terms of procedural fairness, a presumption of innocence or of a lack of malicious intent on the part of the content provider, and the avoidance of abuse.
Amendment 782 #
Motion for a resolution
Annex I – part V – paragraph 3 – indent 5
Annex I – part V – paragraph 3 – indent 5
- the description of the content moderation model applied by the hosting intermediary, as well as any algorithmic decision making which influences the content moderation process.
Amendment 794 #
Motion for a resolution
Annex I – part V – paragraph 5
Annex I – part V – paragraph 5
The Digital Services Act should address the lack of legal certainty regarding the concept of active versus passive hosts. The revised measures should codify the jurisprudence of the European Court of Justice on the matter.
Amendment 799 #
Motion for a resolution
Annex I – part V – paragraph 5 a (new)
Annex I – part V – paragraph 5 a (new)
The Digital Services Act should maintain the protections of non-active providers and other backend and infrastructure services which are not party to the contractual relations between online intermediaries and their business or private customers. Such backend services should not be held liable for actions in which they did not have an active, overarching decision-making role and where they merely implement decisions by the online intermediaries or their customers.
Amendment 800 #
Motion for a resolution
Annex I – part V – paragraph 5 b (new)
Annex I – part V – paragraph 5 b (new)
New proposals for obligations on content management and moderation, if deemed necessary beyond a notice-and-action mechanism, should only be possible within the framework of the suggested regulation on ex-ante measures for significant market players.
Amendment 805 #
Motion for a resolution
Annex I – part V – paragraph 6 a (new)
Annex I – part V – paragraph 6 a (new)
The Digital Services Act, however, may allow for voluntary actions enabling online intermediaries to take social responsibility without losing the protections of Article 14.
Amendment 824 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 5
Annex I – part VI – paragraph 2 – indent 5
- once products have been identified as unsafe by the Union’s rapid alert systems or by consumer protection authorities, it should be compulsory to remove those products from the marketplace within two working days of receiving notification;
Amendment 835 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 7 a (new)
Annex I – part VI – paragraph 2 – indent 7 a (new)
- explore the option of requiring suppliers established in a third country to designate a legal representative, established in the Union, who can be held accountable for the sale to European consumers of products which do not comply with Union safety rules;
Amendment 838 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 8
Annex I – part VI – paragraph 2 – indent 8
- address the liability for online marketplaces if the online marketplace has not informed the consumer that a third party is the actual supplier of the goods or services, thus making the marketplace contractually liable vis-à-vis the consumer; liability should also be considered in case the marketplace willingly provides misleading information, guarantees, or statements;
Amendment 861 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 2
Annex I – part VII – paragraph 2 – indent 2
- such a mechanism should allow the national regulatory authority of the country of origin to impose remedies on these companies in order to address market failures, based on the conditions within the legal instrument and a closed list of positive and negative actions. This report should not prejudge this list, and the impact assessment should make a thorough analysis of the different issues observed on the market so far, such as:
- discrimination in intermediary services;
- making the use of data for making market entry by third parties more difficult;
- lack of interoperability and of appropriate tools, data, expertise, and resources deployed to allow consumers to switch between digital platforms or internet ecosystems;
- obligations on content management and moderation, such as content recommendations and personalisation of a user interface.
Amendment 867 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 2 a (new)
Annex I – part VII – paragraph 2 – indent 2 a (new)
- empower the Commission to impose further conditions and decisions in relation to the rules of competition, including on self-preferencing and overall vertical integration, while ensuring that both policy tools are completely independent;
Amendment 868 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 2 b (new)
Annex I – part VII – paragraph 2 – indent 2 b (new)
- reserve to the Commission the power to decide if an information society service provider is a “systemic platform” based on the conditions of the mechanism;
Amendment 870 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 3
Annex I – part VII – paragraph 2 – indent 3
Amendment 879 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 6
Annex I – part VII – paragraph 2 – indent 6
Amendment 891 #
Motion for a resolution
Annex I – part VIII – paragraph 1
Annex I – part VIII – paragraph 1
The Digital Services Act should strengthen the internal market clause as the cornerstone of the Digital Single Market by complementing it with a new cooperation mechanism aimed at improving cooperation and, upon request, voluntary mutual assistance between Member States, in particular between the home country where the service provider is established and the host country where the provider is offering its services.
Amendment 896 #
Motion for a resolution
Annex I – part VIII – paragraph 2
Annex I – part VIII – paragraph 2
The supervision and enforcement of the Digital Services Act should be improved by giving additional powers to the national regulator of the country of origin for overseeing compliance with the DSA, and by improving external monitoring, verification of platform activities, and enforcement.
Amendment 902 #
Motion for a resolution
Annex I – part VIII – paragraph 3
Annex I – part VIII – paragraph 3
Amendment 906 #
Motion for a resolution
Annex I – part VIII – paragraph 4
Annex I – part VIII – paragraph 4
The Commission, through the Joint Research Centre, should offer its expertise and analysis upon request, including aid during investigations, to the work of the different authorities dealing with illegal content online.
Amendment 916 #
Motion for a resolution
Annex I – part VIII – paragraph 5
Annex I – part VIII – paragraph 5
The Commission could maintain a ‘Platform Scoreboard’ with relevant information on the performance of online platforms.