62 Amendments of Maria da Graça CARVALHO related to 2020/2018(INL)
Amendment 51 #
Motion for a resolution
Recital E
E. whereas in its communication to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 19 February 2020 “Shaping Europe’s digital future”, the Commission committed itself to adopting, as part of the Digital Services Act package, new and revised rules for online platforms and information service providers; to reinforcing the oversight over platforms’ content policies in the EU; and to looking into ex ante rules to ensure that large platforms with significant network effects, acting as gatekeepers, remain fair and contestable for innovators and businesses, including SMEs, start-ups, entrepreneurs and new market entrants; believes that the Digital Services Act should complement the existing legal framework together with other relevant legislation, such as rules on consumer protection, enforcement, product safety, market surveillance, competition, geo-blocking, audio-visual media services, copyright and the General Data Protection Regulation;
Amendment 133 #
Motion for a resolution
Paragraph 6
6. Considers that the Digital Services Act should be based on public values of the Union protecting citizens’ rights and should aim to foster the creation of a rich and diverse online ecosystem with a wide range of online services, a favourable digital environment and legal certainty to unlock the full potential of the Digital Single Market; believes that the EU should focus on removing existing obstacles in the Digital Single Market and on ensuring consumer and fundamental rights protection as one of the main objectives of the reform of the E-Commerce Directive; considers in this context that the Single Market objective can only be achieved if consumer trust is ensured; believes that the updated E-Commerce rules must clearly establish that consumer law and product safety requirements fall within their scope of application in order to ensure legal certainty;
Amendment 182 #
Motion for a resolution
Paragraph 9
9. Recalls that recent scandals regarding data harvesting and selling, such as Cambridge Analytica, as well as fake news, political advertising and manipulation and a host of other online harms (from hate speech to the broadcast of terrorism) have shown the need to revisit the existing rules and reinforce the protection of fundamental rights online;
Amendment 198 #
Motion for a resolution
Paragraph 11
11. Notes that the COVID-19 pandemic has shown how vulnerable EU consumers are to misleading trading practices by dishonest traders selling fake or illegal products online that are not compliant with Union safety rules, or imposing unjustified and abusive price increases or other unfair conditions on consumers; stresses, therefore, the urgent need to set up clear rules in order to enhance consumer protection;
Amendment 285 #
Motion for a resolution
Paragraph 18
18. Considers that consumers should be properly informed and their rights should be effectively guaranteed when they interact with automated decision-making systems and other innovative digital services or applications; considers it essential that automated decision-making systems do not generate unfairly biased outputs for consumers in the single market; believes that it should always be possible for consumers to be properly informed about interacting with automated decision-making, and about how to reach a human with decision-making powers to request checks and corrections of possible mistakes resulting from automated decisions, as well as to seek redress for any damage related to the use of automated decision-making systems;
Amendment 298 #
Motion for a resolution
Subheading 5
Tackling Illegal and Harmful Content Online
Amendment 304 #
Motion for a resolution
Paragraph 19
19. Stresses that the existence and spread of illegal and harmful content online is a severe threat that undermines citizens' trust and confidence in the digital environment, harms the economic development of healthy platform ecosystems in the Digital Single Market and severely hampers the development of legitimate markets for digital services;
Amendment 320 #
Motion for a resolution
Paragraph 20
20. Notes that there is no ‘one size fits all’ solution to all types of illegal and harmful content and cases of disinformation online; believes, however, that a more aligned approach at Union level, taking into account the different types of content and services offered by a platform, will make the fight against illegal content more effective;
Amendment 321 #
Motion for a resolution
Paragraph 20
20. Notes that there is no ‘one size fits all’ solution to all types of illegal and harmful content, including cases of disinformation online; believes, however, that a more aligned approach at Union level, taking into account the different types of content, will make the fight against illegal and harmful content more effective;
Amendment 338 #
Motion for a resolution
Paragraph 21
21. Considers that voluntary actions and self-regulation by online platforms across Europe have been proven unsatisfactory and that much stronger measures are needed in order to ensure the swift detection and removal of illegal and harmful content online;
Amendment 344 #
Motion for a resolution
Paragraph 21 a (new)
21a. Believes that where intermediaries are established in a third country, they should designate a legal representative, established in the Union, who can be held accountable for the products they offer;
Amendment 361 #
Motion for a resolution
Paragraph 22
22. Calls on the Commission to address the increasing differences and fragmentation of national rules in the Member States and to propose concrete legislative measures, including a notice-and-action mechanism, that can empower users to notify online intermediaries of the existence of potentially illegal or harmful online content or behaviour; is of the opinion that such measures would guarantee a high level of users' and consumers' protection while promoting consumer trust in the online economy;
Amendment 392 #
Motion for a resolution
Paragraph 24
24. Notes that while online platforms, such as online marketplaces, have benefited both retailers and consumers by improving choice and lowering prices, they have at the same time allowed sellers, in particular from third countries, to offer products which often do not comply with Union rules on product safety and do not sufficiently guarantee consumer rights; stresses, in this context, the need to ensure that manufacturers and sellers of products from third countries can always be identified; underlines that if one of the services provided by a platform can be considered a marketplace ("hybrid platforms"), the rules should fully apply to that part of the business; and asks online marketplaces to enhance their cooperation by exchanging information on the sellers of these products with the market surveillance and customs authorities;
Amendment 438 #
Motion for a resolution
Paragraph 27
27. Notes that, today, some markets are characterised by large platforms with significant network effects which are able to act as de facto “online gatekeepers” of the digital economy and create new bottlenecks through inflexible terms of access, limited access to operating systems’ functionalities or access to user transaction data;
Amendment 440 #
Motion for a resolution
Paragraph 27
27. Notes that, today, some markets are characterised by large platforms with significant network effects which are able to act as de facto “online gatekeepers” of the digital economy and asks the Commission to analyse the consequences this has for consumers, SMEs and the Single Market;
Amendment 452 #
Motion for a resolution
Paragraph 28
28. Considers that by reducing barriers to market entry and by regulating large platforms, an internal market instrument imposing ex-ante regulatory remedies on platforms with significant market power has the potential to open up markets to new entrants, including SMEs and start-ups, thereby promoting consumer choice and driving innovation beyond what can be achieved by competition law enforcement alone;
Amendment 454 #
Motion for a resolution
Paragraph 28
28. Considers that by reducing barriers to market entry and by regulating large platforms, an internal market instrument imposing ex-ante regulatory remedies on these large platforms has the potential to open up markets to new entrants, including SMEs, entrepreneurs and start-ups, thereby promoting consumer choice and driving innovation beyond what can be achieved by competition law enforcement alone;
Amendment 458 #
Motion for a resolution
Paragraph 28 a (new)
28a. Considers that access to these platforms by other business actors should be ensured in a fair way, avoiding discrimination, self-preferencing practices and the violation of normative regulatory principles; considers that the Commission, in coordination with national regulatory authorities, should establish mechanisms to conduct regular fully-fledged market investigations on gatekeeper platforms to assess their compliance with Union competition laws and impose remedies when needed;
Amendment 461 #
Motion for a resolution
Paragraph 28 a (new)
28a. Believes that the ex-ante regulatory instrument should ensure fair trading conditions on all platforms, including possible additional requirements – for example, a list of obligations/prohibitions – for those that play a gatekeeper role;
Amendment 475 #
Motion for a resolution
Paragraph 30
Amendment 480 #
Motion for a resolution
Paragraph 30
30. Considers that a central regulatory authority should be established which should be responsible for oversight of, and compliance with, the Digital Services Act and have supplementary powers to tackle cross-border issues; it should be entrusted with strong investigation and enforcement powers; stresses that cooperation between national and other Member States’ authorities, civil society and consumer organisations is of utmost importance for achieving effective enforcement;
Amendment 490 #
Motion for a resolution
Paragraph 31
31. Takes the view that the Commission should address complex cross-border issues by working in close cooperation with a network of independent National Enforcement Bodies (NEBs);
Amendment 507 #
Motion for a resolution
Annex I – part I – paragraph 1
The Digital Services Act should contribute to the strengthening of the internal market by ensuring the free movement of digital services, while at the same time guaranteeing a high level of consumer protection and the improvement of users’ safety online;
Amendment 509 #
Motion for a resolution
Annex I – part I – paragraph 1 a (new)
The Digital Services Act should contribute to the removal of the existing unjustified obstacles to the digital single market, which often arise from protectionist measures by Member States, as well as ensure that no new barriers are created;
Amendment 510 #
Motion for a resolution
Annex I – part I – paragraph 2
The Digital Services Act should guarantee that online and offline economic activities are treated equally and on a level playing field which fully reflects the principle that “what is illegal offline is also illegal online”; this principle does not, however, exclude that, due to the specific nature of the online environment and the ease of manipulating users on a mass scale, certain activities allowed offline might not be allowed online;
Amendment 534 #
Motion for a resolution
Annex I – part I – paragraph 6 – indent 1 – subi. 2
- clear and detailed procedures and measures related to the removal of illegal and harmful content online, including a harmonised legally-binding European notice-and-action mechanism;
Amendment 543 #
Motion for a resolution
Annex I – part I – paragraph 6 – indent 2
- an internal market legal instrument in the form of a Regulation, based on Article 114 TFEU, imposing ex-ante obligations on large platforms with a gatekeeper role in the digital ecosystem, complemented by an effective institutional enforcement mechanism.
Amendment 550 #
Motion for a resolution
Annex I – part II – paragraph 1
In the interest of legal certainty, the Digital Services Act should clarify which digital services fall within its scope. The new legal act should follow the horizontal nature of the E-Commerce Directive and apply not only to online platforms but to all digital services which are not covered by specific legislation;
Amendment 572 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 1
- clarify to what extent “new digital services”, such as social media networks, collaborative economy services, search engines, wifi hotspots, online advertising, comparison tools, cloud services, content delivery networks and domain name services fall within the scope of the Digital Services Act;
Amendment 583 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 4
- clarify what falls within the remit of the “illegal content” definition, making it clear that a violation of EU rules on consumer protection, product safety or the offer or sale of food or tobacco products and counterfeit medicines also falls within the definition of illegal content; it is also necessary to clarify what falls under “harmful content” and “disinformation”;
Amendment 590 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 5
- define “systemic operator” by establishing a set of clear economic indicators that allow regulatory authorities to identify platforms which enjoy a significant market position with a “gatekeeper” role, playing a systemic role in the online economy; such indicators could include considerations such as whether the undertaking is active to a significant extent on multi-sided markets, or has predominant influence over its users, the size of its network (number of users), its financial strength, access to data, accumulation of data, vertical integration, the importance of its activity for third parties’ access to supply and markets, etc.
Amendment 594 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 5
- define “systemic operator” by establishing a set of clear economic indicators that allow regulatory authorities to identify platforms with a “gatekeeper” role playing a systemic role in the online economy; such indicators could include considerations such as whether the undertaking is active to a significant extent on multi-sided markets, or has predominant influence over its users, the size of its network (number of users), its financial strength, access to data, vertical integration, the importance of its activity for third parties’ access to supply and markets, etc.
Amendment 641 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 3
3. Requirements on commercial communications
Amendment 645 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 3 – indent 3
- The transparency requirements should include the obligation to disclose who is paying for the advertising, including both direct and indirect payments or any other contributions received by service providers; those requirements should also apply to platforms, even if they are established in third countries; consumers and public authorities should be able to identify who should be held accountable in the case of, for example, false or misleading advertising; these transparency requirements should also empower advertisers vis-à-vis advertising services when it comes to where and when ads are placed; more efforts are needed to make sure that illegal activities cannot be funded via advertising services;
Amendment 649 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 3 – indent 3 a (new)
- The transparency requirements should also apply to targeted adverts; the criteria for profiling target groups and optimising advertising campaigns must be made clear so that any abuse can be verified. Users should be aware of, and have previously given their consent to, receiving targeted adverts;
Amendment 651 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 3 – indent 3 b (new)
- Specific requirements with regard to behavioural advertising, including micro-targeting, should be introduced in order to protect the public interest; behavioural advertising based on certain characteristics, i.e. those exposing mental or physical vulnerabilities, should not be allowed at all, while certain other characteristics should be allowed only on an opt-in basis by users;
Amendment 688 #
Motion for a resolution
Annex I – part V – title
V. MEASURES RELATED TO TACKLING ILLEGAL AND HARMFUL CONTENT ONLINE
Amendment 690 #
Motion for a resolution
Annex I – part V – paragraph 1 – introductory part
The Digital Services Act should provide clarity and guidance regarding how online intermediaries should tackle illegal and harmful content online. The revised rules of the E-Commerce Directive should:
Amendment 694 #
Motion for a resolution
Annex I – part V – paragraph 1 – indent 1
- clarify that any removal of, or disabling of access to, illegal or harmful content should not affect the fundamental rights and the legitimate interests of users and consumers;
Amendment 711 #
Motion for a resolution
Annex I – part V – paragraph 1 – indent 4
- introduce new transparency requirements and independent oversight of the content moderation procedures and tools related to the removal of illegal and harmful content online; such systems and procedures should be available for auditing and testing by independent authorities.
Amendment 723 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 1
- apply to illegal online content or behaviour as well as to harmful content including disinformation;
Amendment 727 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 2
- rank different types of providers, sectors and/or illegal and harmful content;
Amendment 732 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 4
- allow users to easily notify online intermediaries, by electronic means, of potentially illegal or harmful online content or behaviour;
Amendment 738 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 7
- specify the requirements necessary to ensure that notices are of a good quality, thereby enabling a swift removal of illegal content; such requirements should include the name and contact details of the notice provider, the link (URL) to the allegedly illegal or harmful content in question, the stated reason for the claim, including an explanation of the reasons why the notice provider considers the content to be illegal, and if necessary, depending on the type of content, additional evidence for the claim;
Amendment 751 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 11
- create an obligation for the online intermediaries to verify the notified content and reply in a timely manner to the notice provider and the content uploader with a reasoned decision;
Amendment 762 #
Motion for a resolution
Annex I – part V – subheading 2 – indent 1
- The decision taken by the online intermediary on whether or not to act upon content flagged as illegal or harmful should contain a clear justification on the actions undertaken regarding that specific content. The notice provider, where identifiable, should receive a confirmation of receipt and a communication indicating the follow-up given to the notification.
Amendment 763 #
Motion for a resolution
Annex I – part V – subheading 2 – indent 1
- The decision taken by the online intermediary on whether or not to act upon content flagged as illegal should contain a clear justification on the actions undertaken regarding that specific content. The notice provider, where identifiable, should receive a confirmation of receipt and a communication indicating the follow-up given to the notification.
Amendment 765 #
Motion for a resolution
Annex I – part V – subheading 2 – indent 2
- The providers of the content that is being flagged as illegal or harmful should be immediately informed of the notice and, where applicable, of the reasons and decisions taken to remove or disable access to the content; all parties should be duly informed of all existing available legal options and mechanisms to challenge this decision;
Amendment 770 #
Motion for a resolution
Annex I – part V – subheading 2 – indent 4
- If the redress and counter-notice have established that the notified activity or information is neither illegal nor harmful, the online intermediary should restore the content that was removed without undue delay or allow for the re-upload by the user, without prejudice to the platform's terms of service.
Amendment 808 #
Motion for a resolution
Annex I – part VI – paragraph 1
The Digital Services Act should propose specific rules for online marketplaces for the online sale of products and provision of services to consumers.
Amendment 811 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 3
- ensure that online marketplaces make it clear into which country the products are sold or the services are provided, regardless of whether they are provided by that marketplace, a third party or a seller established inside or outside the Union;
Amendment 818 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 4 a (new)
- ensure that online marketplaces provide an easy-to-find specific contact point on their website for consumers and national authorities to give notice of unsafe goods;
Amendment 819 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 4 b (new)
- ensure that online marketplaces have to check whether a product is on the Union Rapid Alert System for dangerous non-food products (Rapex) before placing it on their website;
Amendment 827 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 5 a (new)
- oblige online marketplaces to exchange information on repeat offenders and to take measures to avoid that goods taken down from one website reappear on other online marketplaces;
Amendment 829 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 6
- oblige online marketplaces to inform consumers of any safety issues, to enhance cooperation with national authorities and consumer associations on recalls and to take any action required to ensure that recalls are carried out effectively;
Amendment 849 #
Motion for a resolution
Annex I – part VII – paragraph 1
The Digital Services Act should put forward a proposal to ensure that the systemic role of specific online platforms will not endanger the internal market by unfairly excluding innovative new entrants, including SMEs, entrepreneurs and start-ups, creating market failures;
Amendment 856 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 1
- set up an asymmetric ex-ante mechanism to prevent (instead of merely remedy) unfair market behaviour by “systemic platforms” in the digital world, building on the Platform to Business Regulation; such a mechanism should allow regulatory authorities to impose remedies on companies with a significant market position in order to address market failures, without the establishment of a breach of regulatory rules;
Amendment 857 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 1
- set up an ex-ante mechanism to prevent (instead of merely remedy) unfair market behaviour by “systemic platforms” in the digital world, building on the Platform to Business Regulation; such a mechanism should allow regulatory authorities to impose sanctions on these companies in order to address market failures, without the establishment of a breach of regulatory rules;
Amendment 866 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 2 a (new)
- explore other ex-ante remedies that prevent the creation of new systemic platforms; in addition to the reactive ex-ante mechanism, the Digital Services Act should envisage preventive mechanisms that forestall the creation of digital gatekeepers;
Amendment 878 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 5 a (new)
- ensure that users of "systemic platforms" will be able to effectively control the results of the algorithms that suggest specific content to them; users should be properly informed of all the reasons why specific content is suggested to them;
Amendment 883 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 6
- ensure high levels of interoperability by requiring “systemic platforms” to share appropriate tools, non-rivalrous data, expertise and resources deployed, in order to limit the risks of user and consumer lock-in and of artificially binding users to one systemic platform with no possibility or incentives for switching between digital platforms or internet ecosystems, and to empower users in deciding what kind of content they want to see. As part of those measures, the Commission should explore different technologies and open standards and protocols, including the possibility of a mechanical interface (Application Programming Interface) that allows users of competing platforms to dock on to the systemic platform and exchange information with it.
Amendment 908 #
Motion for a resolution
Annex I – part VIII – paragraph 4
The central regulator should coordinate the work of the different authorities dealing with illegal and harmful content online, enforce compliance, impose fines and be able to carry out audits of intermediaries and platforms.