22 Amendments of Peter LUNDGREN related to 2020/0361(COD)
Amendment 10 #
Proposal for a regulation
Recital 2
Amendment 26 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and need not have a physical location.
Amendment 31 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court and in-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 32 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests, that they work in a diligent and objective manner and that they have a long history of non-partisan conduct. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
Amendment 34 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content, and notices or complaints should be considered manifestly unfounded, where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded. Users and material should never be deleted in an automatic way due to notices and complaints. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 41 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in particular the basic right to an account for all lawful users, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result.
Amendment 54 #
Proposal for a regulation
Recital 68
Amendment 58 #
Proposal for a regulation
Recital 71
(71) In case of extraordinary circumstances affecting public security or public health, service providers may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged to draw up and apply specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.
Amendment 61 #
Proposal for a regulation
Recital 88
Amendment 62 #
Proposal for a regulation
Recital 89
Amendment 63 #
Proposal for a regulation
Recital 90
Amendment 64 #
Proposal for a regulation
Recital 91
Amendment 65 #
Proposal for a regulation
Recital 92
Amendment 66 #
Proposal for a regulation
Recital 93
Amendment 67 #
Proposal for a regulation
Recital 95
Amendment 75 #
Proposal for a regulation
Article 1 – paragraph 5 – point e
Amendment 203 #
Proposal for a regulation
Article 25 a (new)
Article 25 a
The right to an account
1. Very large platforms shall be obliged to provide every user with lawful intentions the ability to create an account. The user shall be able to verify his or her account.
2. Very large platforms shall only be able to delete an account if the user uses the account for illegal activities such as abuse, fraud or terrorist propaganda.
Amendment 204 #
Proposal for a regulation
Article 25 b (new)
Article 25 b
Content moderation
1. Only illegal content may be deleted on a very large platform by the provider.
2. Users may block and delete content on the section of the platform which they control, for example comments on their posts.
Amendment 205 #
Proposal for a regulation
Article 25 c (new)
Article 25 c
Shadow banning
The practice of shadow banning, whereby a user can still use and post on the very large platform but the reach of the user's content is significantly reduced, shall be prohibited.
Amendment 206 #
Proposal for a regulation
Article 25 d (new)
Article 25 d
Algorithms and political views
A very large platform's algorithm that selects the content to be shown shall never disadvantage one side of the political debate.
Amendment 214 #
Proposal for a regulation
Article 35
Amendment 222 #
Proposal for a regulation
Article 37