112 Amendments of Alexandra GEESE related to 2020/2018(INL)
Amendment 54 #
Motion for a resolution
Recital E a (new)
Ea. whereas content hosting intermediaries often take voluntary decisions regarding the legality of content and employ automated content recognition tools, which raises concerns as regards the rule of law and the right to an effective remedy, in contravention of Article 52(1) of the Charter of Fundamental Rights of the European Union, which states that any limitation on the exercise of the rights and freedoms must be provided for by law;
Amendment 56 #
Motion for a resolution
Recital E b (new)
Eb. whereas automated content recognition tools replicate, reinforce and prolong pre-existing biases, discrimination, errors and assumptions about individuals or demographic groups on the basis of gender, race, religion, political opinion or social origin;
Amendment 58 #
Motion for a resolution
Recital E c (new)
Ec. whereas internal rules, such as terms and conditions or community guidelines, of systemic operators are determined unilaterally, whereby users often cannot access the platform of the operator without accepting its internal rules and have to waive all rights and remedies towards the operator;
Amendment 59 #
Motion for a resolution
Recital E d (new)
Ed. whereas the activity of profiling coupled with targeted advertisements not only undermines democratic society, but also leads to an unfair competitive advantage for dominant private actors collecting large amounts of data;
Amendment 60 #
Motion for a resolution
Recital E e (new)
Ee. whereas the choice of algorithmic tools for recommendation systems raises accountability and transparency concerns; whereas users should therefore be guaranteed the possibility to choose whether they want recommendations and personalisation by opting in to such services;
Amendment 75 #
Motion for a resolution
Paragraph 2
2. Recognises the importance of the legal framework set out by the E-Commerce Directive in the development of online services in the Union, and in particular its internal market clause, through which home country control and the obligation on Member States to ensure the free movement of information society services have been established; calls, however, for an update of the territorial scope to include information society services not established in the Union, where their activities are related to: (a) the offering of goods or services, irrespective of whether a payment is required, to consumers or users in the Union; or (b) the monitoring of their behaviour as far as their behaviour takes place within the Union;
Amendment 90 #
Motion for a resolution
Paragraph 3
3. Considers that the main principles of the E-Commerce Directive, such as the internal market clause, freedom of establishment and the prohibition on imposing a general monitoring obligation, should be maintained; underlines that the principle of “what is illegal offline is also illegal online”, the rights and freedoms guaranteed under the Charter of Fundamental Rights, as well as the principles of consumer protection and user safety, should also become guiding principles of the future regulatory framework;
Amendment 118 #
Motion for a resolution
Paragraph 5
5. Takes the view that a level playing field in the internal market between the platform economy and the "traditional" offline economy, based on the same rights and obligations for all interested parties - consumers and businesses - is needed; considers that social protection and the social rights of workers, especially of platform or collaborative economy workers, should be properly addressed in a specific instrument accompanying the future regulatory framework; asks the Commission to introduce further information obligations for collaborative economy platforms, in line with data protection rules, as such information is essential for local authorities in order to ensure the availability of affordable housing;
Amendment 142 #
Motion for a resolution
Paragraph 6
6. Considers that the Digital Services Act should be based on the public values of the Union protecting citizens’ rights and should aim to foster the creation of a rich and diverse online ecosystem with a wide range of online services, a favourable competitive digital environment and legal certainty to unlock the full potential of the Digital Single Market;
Amendment 164 #
Motion for a resolution
Paragraph 8
8. Notes that information society services providers, and in particular online platforms and social networking sites - because of their wide-reaching ability to reach and influence broader audiences, behaviour, opinions, and practices - should comply with Union law to protect users and society at large and to prevent their services from being exploited abusively;
Amendment 184 #
Motion for a resolution
Paragraph 9
9. Recalls that recent scandals regarding data harvesting and selling, Cambridge Analytica, fake news, disinformation, targeted advertising, voter manipulation and a host of other online harms (from hate speech to the broadcast of terrorism) have shown the need to revisit the existing rules and reinforce fundamental rights;
Amendment 191 #
Motion for a resolution
Paragraph 10
10. Stresses that the Digital Services Act should guarantee both the internal market freedoms and the fundamental rights, freedoms and principles set out in the Charter of Fundamental Rights of the European Union;
Amendment 194 #
Motion for a resolution
Paragraph 10 a (new)
10a. Calls on the Commission to introduce minimum standards for contract terms and general conditions of content hosting providers and providers of content moderation tools to provide for safeguards for fundamental rights, in particular with regard to transparency, accessibility, fairness, predictability and non-discriminatory enforcement;
Amendment 223 #
Motion for a resolution
Paragraph 14
14. Calls on the Commission to require service providers to verify the information and identity of business users as defined in Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (‘P2B Regulation’) and to take reasonable measures to ensure that the information they provide is accurate and up-to-date, while preserving consumers’ anonymity; reminds that the verification of the identity of individual users would place extensive administrative burdens on EU start-ups and SMEs competing on a global market;
Amendment 236 #
Motion for a resolution
Paragraph 14 a (new)
14a. Underlines the rights of users under the GDPR, as well as the right to internet anonymity or to being an unidentified user; warns that ignoring the wish of internet users not to disclose their identity might put certain groups at a disadvantage, including by hampering the work of independent media, or deprive vulnerable groups of adequate protection and security online;
Amendment 245 #
Motion for a resolution
Paragraph 15
15. Calls on the Commission to introduce enforceable obligations on internet service providers aimed at increasing transparency, information and accountability; considers that these obligations should be enforced by appropriate, effective and dissuasive penalties;
Amendment 251 #
Motion for a resolution
Paragraph 15 a (new)
15a. Calls on the Commission to introduce transparency and accountability requirements regarding automated decision-making processes of content hosting providers and providers of automated content recognition tools, including the public documentation of, at least, the existence and the functioning of content recognition technologies;
Amendment 252 #
Motion for a resolution
Paragraph 15 b (new)
15b. Welcomes efforts to bring transparency to content removal; underlines that, in order to verify compliance with the rules, the requirement to publish periodic transparency reports should be mandatory and such reports should include, at least, the number of notices, the type of entities notifying content, the nature of the content subject to complaint, the response time of the intermediary and the number of appeals;
Amendment 253 #
Motion for a resolution
Paragraph 15 c (new)
15c. In order to verify such transparency reports and compliance with legal obligations, and in line with the Council of Europe Recommendation CM/Rec(2018)2, Member States should make available, publicly and in a regular manner, comprehensive information on the number, nature and legal basis of requests sent to intermediaries to restrict content or to disclose personal data, including those based on international mutual legal assistance treaties, and on steps taken as a result of those requests;
Amendment 259 #
Motion for a resolution
Paragraph 16
16. Stresses that existing obligations, set out in the E-Commerce Directive and Directive 2005/29/EC of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’)3 on transparency of commercial communications and digital advertising should be strengthened; points out that pressing consumer protection concerns about profiling, targeting and personalised pricing and recommendations cannot only be addressed by transparency obligations and left to consumer choice alone; __________________ 3 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (OJ L 149, 11.6.2005, p. 22).
Amendment 264 #
Motion for a resolution
Paragraph 16 a (new)
16a. Calls for transparency obligations for the recommendation systems of content hosting providers, including the public documentation of recommendation outputs and their audiences, content-specific ranking decisions and other interventions by the platform, the criteria and reasoning behind those decisions, as well as the organisational structures that control such systems; considers that such documentation should take the form of real-time, high-level, anonymised data access through a public API;
Amendment 269 #
Motion for a resolution
Subheading 4
Amendment 273 #
Motion for a resolution
Paragraph 17
17. Believes that while AI-driven services, currently governed by the E-Commerce Directive, have enormous potential to deliver benefits to consumers and service providers, the new Digital Services Act should also address the challenges they present in terms of ensuring non-discrimination, transparency and explainability of algorithms, as well as liability; points out the need for audits of algorithms and mandatory risk assessments of associated risks for individuals, groups and society at large, to use high quality and unbiased datasets, as well as to help individuals acquire access to diverse content, opinions, and high quality products and services;
Amendment 290 #
Motion for a resolution
Paragraph 18
18. Considers that consumers should be properly informed in a timely, impartial, easily readable, standardised and accessible manner and that their rights should be effectively guaranteed when they interact with automated decision-making systems, in particular as regards the right to an effective remedy, and other digital services or applications; believes that it should be possible for consumers to meaningfully contest, and to request checks and corrections of, possible mistakes resulting from automated decisions, as well as to seek redress for any damage related to the use of automated decision-making systems;
Amendment 299 #
Motion for a resolution
Subheading 5
Tackling Illegal Activities Online
Amendment 305 #
Motion for a resolution
Paragraph 19
19. Stresses that the existence and spread of illegal activities online is a severe threat that undermines citizens' trust and confidence in the digital environment, harms the economic development of healthy platform ecosystems in the Digital Single Market and severely hampers the development of legitimate markets for digital services;
Amendment 322 #
Motion for a resolution
Paragraph 20
20. Notes that there is no ‘one size fits all’ solution to all types of illegal activities; believes, however, that a more aligned approach at Union level, taking into account the different types of activities, will make the fight against illegal content more effective;
Amendment 327 #
Motion for a resolution
Paragraph 20 a (new)
20a. Underlines that illegal content should be removed where it is hosted, and that access providers should not be required to block access to content;
Amendment 337 #
Motion for a resolution
Paragraph 21
21. Considers that voluntary actions and self-regulation by online platforms across Europe have led to the removal of content without a clear legal basis and are in contravention of Article 52 of the Charter; a clear legal framework is hence needed in order to ensure the swift notification and removal of illegal content online;
Amendment 374 #
Motion for a resolution
Paragraph 23
23. Stresses that the safeguards of the legal liability regime for hosting intermediaries with regard to user-uploaded content and the general monitoring prohibition set out in Article 15 of the E-Commerce Directive are pivotal for ensuring the availability of content online and for protecting the fundamental rights of users, and need to be preserved; reminds that, in line with Directive (EU) 2018/1808 (AVMS Directive), ex-ante control measures do not comply with Article 15 of Directive 2000/31/EC;
Amendment 385 #
Motion for a resolution
Paragraph 23 a (new)
23a. Asks the Commission to improve consumer rights in the future regulation by introducing safeguards to prevent violations, which are missing from Directive 2000/31/EC; notes that these should include, as a minimum, internal and external dispute resolution mechanisms and the clearly stated possibility of judicial redress;
Amendment 412 #
Motion for a resolution
Paragraph 25 a (new)
25a. Calls for the Digital Services Act to address issues related to online marketplaces; asks for the full implementation of Union product safety and chemicals legislation and calls for a review of the General Product Safety Directive and the Product Liability Directive; calls on Member States to devote sufficient capacities to improving the enforcement of EU product safety and chemicals legislation and on the Commission to provide adequate support in doing so;
Amendment 420 #
Motion for a resolution
Paragraph 26 a (new)
26a. Stresses that consumers should be equally safe whether shopping online or in brick-and-mortar shops; stresses that the Digital Services Act must set out clear obligations for online platforms and create an adapted regime for online marketplaces similar to that for brick-and-mortar shops; calls on Member States to undertake more joint market surveillance actions and to step up collaboration with customs authorities to check the safety of products sold online before they reach consumers;
Amendment 425 #
Motion for a resolution
Paragraph 26 b (new)
26b. Notes that the number of free returns of goods sold online has been increasing over the years, leading to costs that create barriers for SMEs and start-ups, as well as considerable costs to the environment and society as a whole, which are currently not reflected in any way to the consumer; considers that the environmental impact of delivery methods and of packaging made from recycled materials should also be taken into consideration by consumers when making purchases;
Amendment 427 #
Motion for a resolution
Paragraph 26 c (new)
26c. Acknowledging the importance of the right of withdrawal for online or off-premises purchases, calls on the Commission to enable consumers to make better informed choices by enlarging the information available to them on the cost of the return of goods for the company, the environment and society as a whole;
Amendment 462 #
Motion for a resolution
Paragraph 28 a (new)
28a. Calls on the Commission to introduce an obligation for systemic platforms to unbundle hosting and content moderation activities thereby allowing third parties to offer content moderation or curation services to the platforms’ users;
Amendment 463 #
Motion for a resolution
Paragraph 28 b (new)
28b. Underlines that interoperability between competing or complementary products and services is key in a free and competitive market to enable choice for users and innovative services, and to allow users to easily communicate with users of other providers’ services, thereby incentivising systemic platforms to improve their service quality;
Amendment 464 #
Motion for a resolution
Paragraph 28 c (new)
28c. Calls on the Commission to introduce an obligation for systemic intermediaries with significant market power to make available and document tools that allow third parties to interoperate with their main functionalities or to act on a user’s behalf; intermediaries may not share, retain, monetise or use any of the data they receive from third parties in the context of interoperability activities, and intermediaries and third parties must protect users’ privacy and must respect the GDPR and other relevant Union legislation;
Amendment 465 #
Motion for a resolution
Paragraph 28 d (new)
28d. Recommends that providers of a single sign-on service with a dominant market share should be required to also support at least one open and federated identity system based on a non-proprietary framework;
Amendment 483 #
Motion for a resolution
Paragraph 30 a (new)
30a. Underlines that part of the investigative powers of the authority should be the right to conduct audits; considers in this regard that it is essential for the software documentation, the algorithms and data sets used to be fully accessible to the authority, while respecting Union law;
Amendment 485 #
Motion for a resolution
Paragraph 30 b (new)
30b. Stresses that, next to corrective powers, the enforcement powers of the authority should also include the right to issue fines of up to EUR 30 000 000 or, in the case of an undertaking, of up to 5 % of its total worldwide annual turnover;
Amendment 488 #
Motion for a resolution
Paragraph 31
31. Takes the view that the central regulatory authority should facilitate cooperation between Member States to address complex cross-border issues by working in close cooperation with a network of independent National Enforcement Bodies (NEBs); notes that the authority should be responsible in the case of differing decisions in more than one Member State, as well as at the request of the majority of the NEBs;
Amendment 494 #
Motion for a resolution
Paragraph 31 a (new)
31a. Calls for the board to facilitate the creation and maintenance of a European research repository that would combine data from multiple platforms to facilitate appeals processes and enable regulators, researchers and NGOs to review and analyse platform decisions;
Amendment 496 #
Motion for a resolution
Paragraph 31 b (new)
31b. Calls for the establishment of socially representative and diverse, in particular gender balanced, co-regulatory social media councils as a multi- stakeholder mechanism, which would provide for an open, transparent, accountable and participatory forum to address content moderation principles; considers that these social media councils should issue guidance, opinions and expertise;
Amendment 506 #
Motion for a resolution
Annex I – part I – paragraph 1
The Digital Services Act should contribute to the strengthening of the internal market by ensuring the free movement of digital services, while at the same time guaranteeing a high level of consumer protection, including the improvement of users’ rights, freedoms and safety online;
Amendment 511 #
Motion for a resolution
Annex I – part I – paragraph 2
The Digital Services Act should guarantee that online and offline economic activities are treated equally and on a level playing field which fully reflects the principle that “what is illegal offline is also illegal online” and that all rights and freedoms offline should also be guaranteed online;
Amendment 517 #
Motion for a resolution
Annex I – part I – paragraph 4
The Digital Services Act should respect the broad framework of fundamental European rights of users and consumers, such as the protection of privacy, non-discrimination, dignity, fairness, freedom of expression and the right to an effective remedy;
Amendment 533 #
Motion for a resolution
Annex I – part I – paragraph 6 – indent 1 – subi. 2
- clear and detailed procedures and measures related to the removal of illegal content online, including a differentiated, harmonised and legally binding European notice-and-action mechanism;
Amendment 551 #
Motion for a resolution
Annex I – part II – paragraph 1
In the interest of legal certainty, the Digital Services Act should clarify which digital services fall within its scope. The new legal act should follow the horizontal nature of the E-Commerce Directive and apply not only to online platforms but to all digital services, complementing other legislation;
Amendment 555 #
Motion for a resolution
Annex I – part II – paragraph 2
The territorial scope of the future Digital Services Act should be extended to cover also the activities of information society services established in third countries, where their activities are related to the offering of services or goods to consumers or users in the Union, irrespective of whether a payment is required, or to the monitoring of their behaviour as far as their behaviour takes place within the Union;
Amendment 569 #
Motion for a resolution
Annex I – part II – paragraph 7
The Digital Services Act should apply without prejudice to the rules set out in other instruments, such as the General Data Protection Regulation2 (“GDPR”), the Copyright Directive3 and the Audiovisual Media Services Directive4. __________________ 2Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1). 3Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC (OJ L 130, 17.5.2019, p. 92). 4Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (OJ L 95, 15.4.2010, p. 1).
Amendment 575 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 1
- clarify to what extent “new digital services”, such as social media networks, collaborative economy services, search engines, wifi hotspots, online advertising, cloud services, web hosting, messaging services and content delivery networks, fall within the scope of the Digital Services Act;
Amendment 585 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 4
- clarify what falls within the remit of the “illegal content” definition, making it clear that this includes unlawful offers for sale in violation of EU rules on consumer protection, product safety or the offer or sale of food or tobacco products and counterfeit medicines;
Amendment 589 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 5
- define “systemic operator” by establishing a set of clear economic indicators that allow regulatory authorities to identify platforms playing a systemic role in the online economy; such indicators could include considerations such as whether the undertaking is active to a significant extent on multi-sided markets, the size of its network (number of users), its financial strength, access to data, vertical integration, the importance of its activity for third parties’ access to supply and markets, and whether the undertaking has a significant impact on the exercise of fundamental rights and freedoms as well as on access to information in our society;
Amendment 603 #
Motion for a resolution
Annex I – part IV – paragraph 1 – introductory part
The Digital Services Act should introduce clear due diligence, transparency and information obligations rather than a general duty of care; those obligations should not create any derogations from or new exemptions to the current liability regime and the secondary liability rules set out under Articles 12, 13 and 14 of the E-Commerce Directive and should cover the aspects described below:
Amendment 612 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 1 – indent 1
- the information requirements in Article 5 of the E-Commerce Directive should be reinforced and the “Know Your Business Customer” principle should be introduced for business users; service providers should verify the identity of their business users, including their company registration number or any equivalent means of identification including, if necessary, the verified national identity of their ultimate beneficial owner; that information should be accurate and up-to-date, and service providers should not be allowed to provide their services when the identity of their business customer is false, misleading or otherwise invalid;
Amendment 617 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 1 – indent 2
- this measure should apply only to business-to-business relationships and should be without prejudice to the rights of data subjects under the GDPR, as well as to the right to consumer anonymity or to being an unidentified user; the new general information requirements should review and further enhance Articles 5 and 10 of the E-Commerce Directive in order to complement those measures with the information requirements established in recently adopted legislation, in particular the Unfair Contract Terms Directive5, the Consumer Rights Directive and the Platform to Business Regulation. __________________ 5 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts, most recently amended by Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules (OJ L 328, 18.12.2019, p. 7).
Amendment 622 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – introductory part
Amendment 624 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 1
- to expressly set out in their contract terms and general conditions that uploading illegal content bears the full consequences of the applicable law;
Amendment 629 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 2 a (new)
- to ensure that the contract terms and general conditions comply with fundamental rights standards;
Amendment 630 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 3
Amendment 632 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 4
- to ensure that the contract terms and general conditions comply with these and all information requirements established by Union law, including the Unfair Contract Terms Directive, the Unfair Commercial Practices Directive, the Consumer Rights Directive and the GDPR;
Amendment 640 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 5 a (new)
- to notify users whenever they change their terms of service or community standards and to provide a meaningful explanation of any substantial changes to the terms of service;
Amendment 643 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 3 – indent 2
- Building upon Article 6 of the E-Commerce Directive, the new measures should establish a new framework for Platform to Consumer relations on transparency provisions regarding advertising, digital nudging and preferential treatment; paid advertisements or paid placement in a ranking of search results should be identified in a clear, concise and intelligible manner in line with Directive (EU) 2019/2161;
Amendment 657 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 4
4. Content moderation, prioritisation and personalisation
Amendment 666 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 4 – indent 2 a (new)
- create a recurring risk assessment obligation for automated decision-making tools; such a provision would be agreed after consulting content hosting providers and other stakeholders, and its implementation would be monitored by the authority of the legally accountable, competent Member State or, for providers active in more than one country, by the European authority;
Amendment 669 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 4 – indent 3
- establish the principle of safety and security by design and by default;
Amendment 680 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 5
Annex I – part IV – paragraph 1 – subheading 5
Amendment 681 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 5
Annex I – part IV – paragraph 1 – subheading 5
Amendment 696 #
Motion for a resolution
Annex I – part V – paragraph 1 – indent 2
Annex I – part V – paragraph 1 – indent 2
- enhance the central role played by the internet in facilitating the public debate and the free dissemination of facts, opinions, and ideas;
Amendment 702 #
Motion for a resolution
Annex I – part V – paragraph 1 – indent 3
Annex I – part V – paragraph 1 – indent 3
- preserve the underlying legal principle that online intermediaries should not be held directly liable for the acts of their users and that online intermediaries can continue moderating legal content under fair, accessible, predictable and transparent terms and conditions of service, provided that they are applied in a non-discriminatory manner;
Amendment 720 #
Motion for a resolution
Annex I – part V – paragraph 2 – introductory part
Annex I – part V – paragraph 2 – introductory part
The Digital Services Act should establish a differentiated, harmonised and legally enforceable notice-and-action mechanism based on a set of clear processes and precise timeframes for each step of the notice-and-action procedure. That notice-and-action mechanism should:
Amendment 730 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 2 a (new)
Annex I – part V – paragraph 2 – indent 2 a (new)
- offer different notification categories for different types of illegal content;
Amendment 731 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 3
Annex I – part V – paragraph 2 – indent 3
- create easily accessible, reliable and user-friendly procedures tailored to the type of content;
Amendment 735 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 6
Annex I – part V – paragraph 2 – indent 6
- guarantee that notices, unless issued by a judicial authority, will not automatically trigger legal liability, nor impose any removal requirement for specific pieces of content or any obligation to assess their legality;
Amendment 737 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 7
Annex I – part V – paragraph 2 – indent 7
- specify the requirements necessary to ensure that notices are of a good quality, thereby enabling a swift removal of illegal content; such requirements should include the name and contact details of the notice provider, the location (URL) of the allegedly illegal content in question, an indication of the time and date when the alleged wrongdoing was committed, the stated reason for the claim, including an explanation of the reasons why the notice provider considers the content to be illegal, and, if necessary depending on the type of content, additional evidence for the claim, a declaration of good faith that the information provided is accurate, and information on how to issue a counter-notice;
Amendment 744 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 8
Annex I – part V – paragraph 2 – indent 8
- allow for the submission of anonymous complaints, except in cases of violations of personality rights or intellectual property rights;
Amendment 746 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 9
Annex I – part V – paragraph 2 – indent 9
Amendment 748 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 10
Annex I – part V – paragraph 2 – indent 10
- set up safeguards and provide for sanctions to prevent abusive behaviour by users who systematically, repeatedly and in bad faith submit wrongful or abusive notices;
Amendment 749 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 11
Annex I – part V – paragraph 2 – indent 11
- create an obligation for online intermediaries to verify the notified content and reply to the notice provider and the content uploader with a reasoned decision; the reply should include the reasoning behind the decision, how the decision was made, whether the decision was made by a human or by an automated decision agent, and information about the possibility for either party to appeal the decision with the intermediary, the courts or other entities;
Amendment 755 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 12 a (new)
Annex I – part V – paragraph 2 – indent 12 a (new)
- create an obligation for intermediaries to publish information about their procedures and time frames for intervention by interested parties, including the time before a notification is sent to the content uploader, the time for the content uploader to respond with a counter-notification, the average and maximum time for a decision by the platform for categories of cases, the time at which the intermediary will inform both parties about the result of the procedure, and the time for different forms of appeal against the decision.
Amendment 764 #
Motion for a resolution
Annex I – part V – subheading 2 – indent 2
Annex I – part V – subheading 2 – indent 2
- The providers of the content that is being flagged as illegal should be immediately informed of the notice and, where applicable, of the reasons and decisions taken to remove, suspend or disable access to the content; all parties should be duly informed of all available legal options and mechanisms to challenge the decision; in complex areas of law mainly involving two parties external to the provider, such as alleged defamation or copyright infringements, a notice-and-notice system is more appropriate, with additional safeguards put in place;
Amendment 768 #
Motion for a resolution
Annex I – part V – subheading 2 – indent 4
Annex I – part V – subheading 2 – indent 4
- If the redress and counter-notice have established that the notified activity or information is not illegal, the online intermediary should restore the content that was removed or suspended without undue delay or allow for the re-upload by the user, without prejudice to the online intermediary’s terms of service.
Amendment 773 #
Motion for a resolution
Annex I – part V – subheading 2 a (new)
Annex I – part V – subheading 2 a (new)
2a. Independent dispute settlement bodies
Amendment 774 #
Motion for a resolution
Annex I – part V – subheading 2 a (new)
Annex I – part V – subheading 2 a (new)
Amendment 775 #
Motion for a resolution
Annex I – part V – subheading 2 b (new)
Annex I – part V – subheading 2 b (new)
Amendment 776 #
Motion for a resolution
Annex I – part V – paragraph 3 – introductory part
Annex I – part V – paragraph 3 – introductory part
The notice-and-action mechanisms should be transparent and publicly available to any interested party; to that end, online intermediaries and Member States should be obliged to publish annual reports. Online intermediaries’ reports should be standardised and contain information on:
Amendment 779 #
Motion for a resolution
Annex I – part V – paragraph 3 – indent 1 a (new)
Annex I – part V – paragraph 3 – indent 1 a (new)
- the response time per type of content;
Amendment 780 #
Motion for a resolution
Annex I – part V – paragraph 3 – indent 4 a (new)
Annex I – part V – paragraph 3 – indent 4 a (new)
- the number of erroneous takedowns;
Amendment 783 #
Motion for a resolution
Annex I – part V – paragraph 3 – indent 5
Annex I – part V – paragraph 3 – indent 5
- the description of the content moderation model applied by the hosting intermediary, as well as of any algorithmic decision-making that influences the content moderation process, including its functioning and logic.
Amendment 784 #
Motion for a resolution
Annex I – part V – paragraph 3 – indent 5 a (new)
Annex I – part V – paragraph 3 – indent 5 a (new)
- an obligation for intermediaries to provide the aggregated data of transparency reports via a publicly available real-time API. Such an API should be standardised by the European regulator to allow for comparability across providers.
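By way of illustration only, a standardised, machine-readable transparency report of the kind envisaged above could expose aggregated notice-and-action figures in a common schema; the amendment leaves the actual standard to the European regulator, so every field and value below is a hypothetical sketch:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical schema for a standardised transparency-report API response.
# All field names are illustrative assumptions, not part of the amendment.
@dataclass
class NoticeStats:
    content_type: str           # e.g. "copyright", "defamation"
    notices_received: int
    items_removed: int
    erroneous_takedowns: int
    median_response_hours: float

@dataclass
class TransparencyReport:
    provider: str
    period: str                 # reporting period, e.g. "2023-01/2023-12"
    stats: list[NoticeStats]

    def to_json(self) -> str:
        """Serialise to the JSON shape a public API could return."""
        return json.dumps(asdict(self), indent=2)

report = TransparencyReport(
    provider="example-platform",
    period="2023-01/2023-12",
    stats=[NoticeStats("copyright", 1200, 950, 14, 6.5)],
)
print(report.to_json())
```

A schema standardised in this way is what would make the figures comparable across providers, as the amendment requires.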
Amendment 787 #
Motion for a resolution
Annex I – part V – paragraph 3 – indent 5 b (new)
Annex I – part V – paragraph 3 – indent 5 b (new)
- Member States’ reports should contain information on the number, nature and legal basis of content restriction requests sent to intermediaries, and on the actions taken as a result of those requests.
Amendment 795 #
Motion for a resolution
Annex I – part V – paragraph 5
Annex I – part V – paragraph 5
The Digital Services Act should consider replacing the concept of active vs passive hosts. The revised measures should clarify if interventions by hosting providers creating the content or having a certain “degree of contribution to the illegality of the content” and which amounts to adoption of the third-party content as one’s own (as judged by average users or consumers), as well as the question whether a provider is optimising economic models of services in ways that bring inherent risks of illegal or harmful content/activity and/or fundamental rights and freedoms, should lead to a loss of safe harbour provisions due to their active nature.
Amendment 801 #
Motion for a resolution
Annex I – part V – paragraph 6
Annex I – part V – paragraph 6
The Digital Services Act should maintain the ban on general monitoring obligations under Article 15 of the current E-Commerce Directive, clarifying however that the indiscriminate verification and analysis of all content or communications hosted by an information society service provider also falls within the definition of general monitoring. Online intermediaries should not be subject to general monitoring obligations.
Amendment 807 #
Motion for a resolution
Annex I – part VI – paragraph 1
Annex I – part VI – paragraph 1
The Digital Services Act should propose specific rules for online marketplaces for the online sale, promotion or supply of products and services to consumers.
Amendment 815 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 4
Annex I – part VI – paragraph 2 – indent 4
- ensure that online marketplaces remove any misleading information given by the supplier or by customers, including misleading guarantees and statements made by the supplier, failing which they would become liable;
Amendment 822 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 5
Annex I – part VI – paragraph 2 – indent 5
- make it compulsory to remove products from the marketplace within 24 hours once they have been identified as unsafe by the Union’s rapid alert systems, by national market surveillance authorities, by customs authorities or by consumer protection authorities;
Amendment 826 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 5 a (new)
Annex I – part VI – paragraph 2 – indent 5 a (new)
- include an obligation to protect users, so that, where an online marketplace has obtained credible evidence of illegal activities on its platform but fails to take adequate measures for the protection of the online consumer, it becomes liable for consumers’ damages resulting from that failure;
Amendment 845 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 9 a (new)
Annex I – part VI – paragraph 2 – indent 9 a (new)
- ensure that online marketplaces provide clear and easily understandable information to consumers on the impact of e-commerce on the environment; more particularly, online marketplaces should be obliged to provide information on the use of sustainable and efficient product delivery methods and of environmentally sound packaging, as well as on the carbon footprint and other environmental impacts of returning unwanted items, involving double transportation or requiring disposal rather than resale.
Amendment 850 #
Motion for a resolution
Annex I – part VII – paragraph 1
Annex I – part VII – paragraph 1
The Digital Services Act should put forward a proposal to ensure that the systemic role of certain online platforms will not endanger the internal market by unfairly excluding innovative new entrants, including SMEs, and to provide for real consumer choice.
Amendment 860 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 1
Annex I – part VII – paragraph 2 – indent 1
- set up an ex-ante mechanism to prevent (instead of merely remedy) unfair market behaviour by “systemic platforms” in the digital world, building on the Platform to Business Regulation; such a mechanism should allow regulatory authorities to impose remedies on these companies in order to address market failures, without the establishment of a breach of competition rules;
Amendment 880 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 6
Annex I – part VII – paragraph 2 – indent 6
- impose a high level of interoperability, with measures requiring “systemic platforms” to share appropriate tools, data, expertise and resources, in order to limit the risk of user and consumer lock-in and of artificially binding users to one systemic platform with no possibility or incentive to switch between digital platforms or internet ecosystems. As part of those measures, the Commission should explore different technologies and open standards and protocols, including the possibility of a mechanical interface (Application Programming Interface) to be provided by systemic platforms, especially social media and messaging services, that allows users of competing platforms to dock on to the systemic platform and exchange information with it. Systemic platforms may not share, retain, monetise or use any of the data received from third parties during interoperability activities. Interoperability obligations should not limit, hinder or delay the ability of intermediaries to patch vulnerabilities.
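As a purely hypothetical sketch of the kind of open interface the indent above envisages, a systemic messaging platform could accept messages from competing services through a documented endpoint while storing nothing beyond what delivery requires; the class and method names are illustrative assumptions, not a prescribed design:

```python
# Illustrative sketch of an interoperability interface (API) through
# which a competing service delivers messages to a systemic platform.
# Consistent with the amendment's rule, no third-party sender data is
# retained or monetised: only the message body reaches the inbox.
class InteroperabilityEndpoint:
    def __init__(self) -> None:
        self._inboxes: dict[str, list[str]] = {}

    def deliver(self, recipient: str, body: str) -> bool:
        """Accept a message from a federated third-party service and
        place it in the recipient's inbox; sender metadata is discarded."""
        self._inboxes.setdefault(recipient, []).append(body)
        return True

    def read(self, recipient: str) -> list[str]:
        """Return the recipient's inbox (empty if none exists)."""
        return self._inboxes.get(recipient, [])

endpoint = InteroperabilityEndpoint()
endpoint.deliver("alice", "hello from a competing platform")
print(endpoint.read("alice"))
```

In practice such an interface would be defined by an open standard or protocol, as the text suggests, rather than by any single platform.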
Amendment 887 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 6 a (new)
Annex I – part VII – paragraph 2 – indent 6 a (new)
- put in place transparency obligations for the recommendation systems of systemic providers, including public documentation of the rules and criteria governing the functioning of recommendation algorithms, of recommendation outputs and their audiences, of content-specific ranking decisions and other interventions by the platform, and of the organisational structures that control such systems, as well as real-time, high-level, anonymised data access through public APIs to verify the information provided in the public documentation.
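To illustrate what "high-level, anonymised data access" to a recommendation system might involve, the sketch below aggregates recommendation outputs into coarse buckets and suppresses any bucket too small to be safely anonymous; the threshold value and record format are assumptions for illustration only:

```python
from collections import Counter

# Illustrative only: minimum bucket size before a count is published,
# a simple anonymisation safeguard (the actual threshold would be a
# regulatory choice, not the value assumed here).
MIN_AUDIENCE = 100

def anonymised_summary(recommendation_log):
    """Count recommendations per (content category, audience segment)
    and drop any bucket below the anonymity threshold."""
    counts = Counter(
        (rec["category"], rec["audience_segment"]) for rec in recommendation_log
    )
    return {bucket: n for bucket, n in counts.items() if n >= MIN_AUDIENCE}

log = (
    [{"category": "news", "audience_segment": "18-24"}] * 150
    + [{"category": "politics", "audience_segment": "25-34"}] * 40
)
summary = anonymised_summary(log)
print(summary)  # the 40-item bucket is suppressed as below threshold
```

Publishing only such thresholded aggregates is one way a public API could let researchers verify the documented behaviour of a recommendation system without exposing individual users.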
Amendment 889 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 6 b (new)
Annex I – part VII – paragraph 2 – indent 6 b (new)
- create an unbundling remedy for hosting and content moderation activities, thereby allowing third parties to offer content moderation or curation services to the platforms’ users. It should be designed so as to address both the contractual layer and the technical layer.
Amendment 890 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 6 c (new)
Annex I – part VII – paragraph 2 – indent 6 c (new)
- entrust the European Commission’s Directorate-General for Competition with additional powers under Council Regulation (EC) No 1/2003 of 16 December 2002 on the implementation of the rules on competition laid down in Articles 101 and 102 of the Treaty to send, following a market investigation, recommendations to market players as a means to intervene before markets tip in favour of the incumbent platform, and therefore to prevent serious damage to competition and consumers.
Amendment 897 #
Motion for a resolution
Annex I – part VIII – paragraph 2
Annex I – part VIII – paragraph 2
The supervision and enforcement of the Digital Services Act should be improved by the creation of a central regulatory authority, which should be responsible for overseeing compliance with the DSA, improving external monitoring and the verification of platform activities, and ensuring better enforcement.
Amendment 901 #
Motion for a resolution
Annex I – part VIII – paragraph 3
Annex I – part VIII – paragraph 3
The central regulatory authority should facilitate cooperation between the Member States to address complex cross-border issues; to that end, it should work together with the network of independent NEBs and have detailed and extensive enforcement powers to launch initiatives and investigations into cross-border systemic issues.
Amendment 905 #
Motion for a resolution
Annex I – part VIII – paragraph 4
Annex I – part VIII – paragraph 4
The central regulator should coordinate the work of the different authorities dealing with illegal content online, enforce compliance, impose fines, and be able to carry out audits of intermediaries and platforms; in case of disagreement among the NEBs, at the request of a majority of NEBs, or in the case of issues relevant to more than one country, it should take the final decision.
Amendment 910 #
Motion for a resolution
Annex I – part VIII – paragraph 4 a (new)
Annex I – part VIII – paragraph 4 a (new)
The investigative powers of the authority should include the right to conduct audits; in this regard, it is essential that the software documentation, the algorithms and the data sets used be fully accessible to the authority, while respecting Union law.
Amendment 911 #
Motion for a resolution
Annex I – part VIII – paragraph 4 b (new)
Annex I – part VIII – paragraph 4 b (new)
The authority should facilitate and support the creation and maintenance of a European research repository that would combine data from multiple platforms to facilitate appeals processes and enable regulators, researchers and NGOs to review and analyse platform decisions.
Amendment 912 #
Motion for a resolution
Annex I – part VIII – paragraph 4 c (new)
Annex I – part VIII – paragraph 4 c (new)
In addition to corrective powers, the enforcement powers of the authority should include the right to issue fines of up to EUR 30 000 000 or, in the case of an undertaking, up to 5 % of its total worldwide annual turnover.
Amendment 917 #
Motion for a resolution
Annex I – part VIII – paragraph 5
Annex I – part VIII – paragraph 5
The central regulator should report to the Union institutions and maintain a public ‘Platform Scoreboard’ with relevant information on the performance of online platforms.