12 Amendments of Anna-Michelle ASIMAKOPOULOU related to 2020/2022(INI)
Amendment 6 #
Draft opinion
Paragraph 1
1. Welcomes the Commission’s intention to introduce a harmonised approach addressing obligations for intermediaries, in order to avoid fragmentation of the internal market and to guarantee the respect of fundamental rights and freedoms of users across the Union in a uniform way; stresses that any measure related to fundamental rights should be carefully balanced and ensure a fair digital ecosystem; therefore calls on the Commission to propose the widest possible harmonisation of rules on liability exemptions and content moderation in order to achieve the most efficient and effective solutions for the internal market as a whole;
Amendment 7 #
Draft opinion
Paragraph 1
1. Welcomes the Commission’s intention to introduce a harmonised approach addressing obligations for online intermediaries, in order to avoid fragmentation of the internal market while guaranteeing users’ fundamental rights; stresses that any measure related to fundamental rights should be carefully balanced and take into account the possible impact on the functioning of the internal market, and calls on the Commission to avoid the ‘export’ of national regulations and instead to propose the most efficient and effective solutions for the internal market as a whole, without creating new administrative burdens and while keeping the digital single market open and competitive;
Amendment 13 #
Draft opinion
Paragraph 2
2. States that limited liability provisions as set out in the e-Commerce Directive1 must be maintained and adjusted in the Digital Services Act, so as to make sure that platforms that are actively hosting or moderating content bear more responsibility, proportionately to their capabilities, for illegal or harmful content; argues that besides the content, platforms should be responsible for the effects of their algorithms and their actions, including moderating and curating of the content, vis-à-vis consumers; highlights that these adjustments should pay particular attention to protecting freedom of expression and the freedom to provide services; emphasises that this should be achieved without introducing any sort of general monitoring requirements; underlines the importance of these protections to the growth of European SMEs; _________________ 1 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’), OJ L 178, 17.7.2000, p. 1.
Amendment 17 #
Draft opinion
Paragraph 2
2. States that limited liability provisions as set out in the e-Commerce Directive1 must be maintained and, where needed, updated in the Digital Services Act to better protect users and consumers, particularly in order to protect freedom of expression and the freedom to provide services; underlines the importance of these protections to the growth of European companies, SMEs and microbusinesses in particular; _________________ 1 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’), OJ L 178, 17.7.2000, p. 1.
Amendment 20 #
Draft opinion
Paragraph 2 a (new)
2a. Calls on the Commission to introduce provisions protecting consumers from harmful microtargeting; in this respect, believes that specific limitations, i.e. of microtargeting based on characteristics exposing physical or psychological vulnerabilities, transparency obligations in regard to algorithms used by platforms and adequate tools empowering users to enforce fundamental rights online, are necessary in order to protect consumer rights effectively;
Amendment 24 #
Draft opinion
Paragraph 3
3. Recognises that microcompanies and large players have differing capabilities with regard to the moderation of content; in this regard, calls on the Commission to make sure that the legislative proposal will keep the digital single market open and competitive and to pay special attention to the specific needs of microcompanies;
Amendment 31 #
Draft opinion
Paragraph 3
3. Recognises that online intermediaries, including microcompanies, SMEs and large players, have differing capabilities with regard to the moderation of content; warns that overburdening businesses with disproportionate new obligations could further hinder the growth of SMEs and require recourse to automatic filtering tools, which may often lead to the removal of legal content;
Amendment 36 #
Draft opinion
Paragraph 4
4. Notes that there are often significant differences between digital services; therefore calls for the avoidance of a one-size-fits-all approach, where differentiation of instruments is needed;
Amendment 39 #
Draft opinion
Paragraph 5
5. Recalls the fact that disinformation and harmful content is not always illegal in every Member State; calls, therefore, for the establishment of a well-defined notice-and-action mechanism; believes that such a process, adding requirements for platforms to take any measures regarding the content they host, must take note of the significant differences between digital services and be proportionate to their scale of reach and operational capacities, so as to avoid unnecessary regulatory burdens for the platforms and any resulting restrictions of fundamental rights, such as any restriction on the freedom of expression; supports greater cooperation between Member States, competent authorities and platforms with the aim of developing and improving soft law approaches in order to further tackle disinformation;
Amendment 40 #
Draft opinion
Paragraph 5
5. Recalls the fact that disinformation and harmful content, yet detrimental to society, is not always illegal in all Member States; calls, therefore, for the establishment of a well-defined notice-and-action process; supports introducing obligations for business operators to take proactive measures, proportionate to their scale of reach and operational capacities, to address the appearance of illegal and harmful content, including disinformation, on their services; supports an intensive dialogue between authorities and relevant stakeholders in order to further tackle disinformation;
Amendment 52 #
Draft opinion
Paragraph 6
6. Calls for the introduction of proper safeguards and due process requirements, including human oversight and verification of all complaints, as well as counter-notice tools, in order to ensure that removal decisions are accurate and content owners can defend their rights adequately and in a timely manner when notified of any takedown; underlines its view that delegating the responsibility to set boundaries on freedom of speech solely to private companies is unacceptable and creates risks for both citizens and businesses; for this reason, removal of content should be followed up by law enforcement.
Amendment 55 #
Draft opinion
Paragraph 6
6. Calls for the introduction of appropriate safeguards, due process obligations and counter-notice procedures to allow content owners to defend their rights adequately and in a timely manner when notified of any takedown; underlines its view that delegating the responsibility to set boundaries on freedom of speech to private companies is unacceptable and creates risks for both citizens and businesses, neither of which are qualified to take such decisions; believes that the removal of content should be followed up by law enforcement where needed;