Activities of Robert ROOS related to 2020/0361(COD)
Plenary speeches (1)
Digital Services Act (continuation of debate)
Shadow opinions (1)
OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
Amendments (77)
Amendment 118 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service.
Amendment 126 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 129 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
Amendment 132 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling manifestly illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be manifestly illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
Amendment 133 #
Proposal for a regulation
Recital 40 a (new)
(40 a) It is worth noting that leaking information can constitute a criminal offence, or is in most cases linked to a criminal offence. No provision in this Regulation can be read as imposing a default obligation to take leaked information offline. This does not mean that leaked information can under no circumstances qualify as manifestly illegal content. Leaked information that is in the interest of the public debate shall in principle not be deemed illegal content. However, where the leaked information does not in itself contain any indication of illegal activities, it shall be deemed manifestly illegal and subsequently be removed.
Amendment 134 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content. While an absolute hierarchy between these rights does not exist, particular weight shall be given to the freedom of expression, as this right is a cornerstone of a democratic society.
Amendment 143 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court dispute settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise, with a focus on the freedom of expression and its limitations, to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 145 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court and in-court dispute settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 149 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests, that they work in a diligent and objective manner and that they have a long history of non-partisan behaviour. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43
_________________
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
Amendment 172 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to ensure that very large online platforms fulfil the aforementioned roles to the fullest extent and do not limit the public debate, or silence dissenting opinions, there being no alternative and less restrictive measures that would effectively achieve the same result. In general, everyone shall have the right to be on a very large online platform. Only in very exceptional cases can one be permanently denied access to a very large online platform. These exceptional cases are limited to cases where the recipient repeatedly disseminates manifestly illegal content that violates public order or public health. It should always be possible for a decision to permanently ban a recipient to be revoked by a competent court in accordance with the law of the Member States.
Amendment 173 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in particular the obligation to guarantee the basic right to an account for all lawful users, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result.
Amendment 176 #
Proposal for a regulation
Recital 56
Amendment 180 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. When it comes to hate speech, it must be underlined that it is nearly impossible for online platforms to assess whether hate speech constitutes illegal hate speech or is protected by the freedom of expression. For example, an expression made in the context of the public debate or in the context of a religion, or an expression made by a comedian or a politician, will in almost all cases be protected by the freedom of expression according to the European Court of Human Rights. It is therefore not up to online platforms to determine whether an expression constitutes illegal hate speech, but up to judges. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 183 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. As part of such mitigating measures, very large online platforms should consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may, however, not impose corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources, as long as the content is not deemed manifestly illegal. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding the freedom of expression and public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 192 #
Proposal for a regulation
Recital 68
Amendment 226 #
Proposal for a regulation
Article 2 – paragraph 1 – point f a (new)
(f a) ‘allegedly illegal content’ means content that is subject to allegations of illegality;
Amendment 227 #
Proposal for a regulation
Article 2 – paragraph 1 – point f b (new)
(f b) ‘manifestly illegal content’ means content that is unmistakably illegal and whose illegal character is instantly manifest to any average citizen without a legal background;
Amendment 228 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘proven illegal content’ means all content that a competent judicial body has deemed illegal;
Amendment 237 #
Proposal for a regulation
Article 2 – paragraph 1 – point l
(l) ‘Digital Services Coordinator of establishment’ means the Digital Services Coordinator of the Member State where the provider of an intermediary service is established or, in the case that the intermediary service is not established inside the European Union, its legal representative resides or is established;
Amendment 242 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
(p) ‘content moderation’ means the activities undertaken by providers of intermediary services aimed at detecting, identifying and addressing allegedly, manifestly or proven illegal content, or content incompatible with their terms and conditions to the extent that the intermediary service is allowed to moderate this content under its terms and conditions in accordance with Articles 12 and 25a of this Regulation, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
Amendment 255 #
Proposal for a regulation
Article 5 – paragraph 1 – point a
(a) does not have actual knowledge of illegal activity or the manifestly illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or manifestly illegal content is apparent; or
Amendment 256 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the manifestly illegal content; or
Amendment 257 #
Proposal for a regulation
Article 5 – paragraph 1 – point b a (new)
(b a) upon obtaining knowledge or awareness of an order from a competent judicial body, acts expeditiously to remove or to disable access to the proven illegal content.
Amendment 265 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, manifestly illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation.
Amendment 273 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of proven illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 275 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
— a statement of reasons explaining why the content was proven illegal, by reference to the specific provision of Union or national law infringed;
Amendment 282 #
Proposal for a regulation
Article 11 – paragraph 3
Amendment 320 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be manifestly illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means.
Amendment 322 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify whether the content in question qualifies as manifestly illegal. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 325 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
(a) an explanation of the reasons why the individual or entity considers the information in question to be manifestly illegal content;
Amendment 326 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the manifestly illegal content;
Amendment 355 #
Proposal for a regulation
Article 15 – paragraph 2 – point d
(d) where the decision concerns manifestly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be manifestly illegal content on that ground;
Amendment 363 #
Proposal for a regulation
Article 15 a (new)
Article 15 a
Market entry protection
The provisions in this Section shall not be enforced against newly established entities for a period of one year after their establishment. During this period, newly established entities shall make every reasonable effort to comply with the provisions in this Section and shall act in good faith.
Amendment 366 #
Proposal for a regulation
Article 16 – title
Exclusion for SMEs
Amendment 371 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as SMEs within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 376 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is manifestly illegal content or incompatible with its terms and conditions:
Amendment 384 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not manifestly illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
Amendment 385 #
Proposal for a regulation
Article 17 – paragraph 3 a (new)
3 a. If a decision is reversed under paragraph 3, the online platform shall compensate the recipient with an amount of EUR 25, or EUR 50 if the online platform qualifies as a very large online platform. This is without prejudice to the recipient’s right to seek compensation for the actual damage suffered.
Amendment 392 #
Proposal for a regulation
Article 18 – paragraph 2 – point b
(b) it has the necessary expertise concerning the issues arising in one or more particular areas of illegal content, in particular as regards the applicable laws relating to freedom of expression and its limitations and the applicable case law, including the case-law of the European Court of Human Rights, or concerning the application and enforcement of terms and conditions of one or more types of online platforms, allowing the body to contribute effectively to the settlement of a dispute;
Amendment 396 #
Proposal for a regulation
Article 18 – paragraph 3 a (new)
3 a. If the online platform concerned is a very large online platform, the maximum fee for an out-of-court settlement procedure shall not exceed EUR 25 for a consumer. If the decision is in favour of the recipient, the very large online platform shall reimburse any fee paid to the body for the dispute settlement. The recipient shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in connection with the dispute settlement. In addition, the very large online platform shall compensate the recipient with a minimum amount of EUR 100. This is without prejudice to the recipient’s right to seek compensation for the actual damage suffered.
Amendment 404 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying manifestly illegal content;
Amendment 405 #
Proposal for a regulation
Article 19 – paragraph 2 – point a a (new)
(a a) it has sufficient legal expertise as regards the law on freedom of expression and its limitations, including the applicable case law of the European Court of Human Rights;
Amendment 419 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms may suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide proven illegal content, taking into account the severity of the illegal content disseminated via the online platform.
Amendment 423 #
Proposal for a regulation
Article 20 – paragraph 1 a (new)
1 a. Online platforms shall without delay terminate the provision of their services to recipients disseminating manifestly illegal content, provided that the manifestly illegal content also constitutes a crime against public order, public morals or public health.
Amendment 424 #
Proposal for a regulation
Article 20 – paragraph 1 b (new)
1 b. Online platforms may suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that repeatedly provide manifestly illegal content that does not constitute a crime against public order, public health or public morals. Online platforms shall take into account the severity of the illegal content disseminated via the online platform.
Amendment 435 #
Proposal for a regulation
Article 20 – paragraph 4 a (new)
4 a. If the recipient’s account is suspended and the online platform is a very large online platform, the Freedom of Expression Officer shall be notified of the suspension immediately.
Amendment 476 #
Proposal for a regulation
Article 22 – paragraph 7 a (new)
7 a. Online platforms shall ensure that traders are swiftly approved, provided that they have supplied the necessary information correctly;
Amendment 480 #
Proposal for a regulation
Article 23 – paragraph 1 – point b
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
Amendment 518 #
Proposal for a regulation
Article 25 a (new)
Article 25 a
Terms and conditions of very large online platforms
The terms and conditions of very large online platforms shall not provide additional conditions defining what content is allowed on their platform. These boundaries are prescribed by applicable Union and national law. The terms and conditions of very large online platforms shall not have any adverse effects on the fundamental rights enshrined in the Charter of Fundamental Rights of the European Union, in particular the fundamental right to freedom of expression, in accordance with the applicable law of the Member States and the applicable Union law.
Amendment 520 #
Proposal for a regulation
Article 25 b (new)
Article 25 b
Equal treatment of legal content
The algorithms of very large online platforms shall not assess the intrinsic character of the content disseminated through their platform. Furthermore, very large online platforms shall not take corrective measures on legal content, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources.
Amendment 522 #
Proposal for a regulation
Article 25 c (new)
Article 25 c
Shadow banning
The practice of shadow banning, whereby a user can still use and post on the very large online platform but the reach of that user’s content is significantly reduced, shall be prohibited.
Amendment 523 #
Proposal for a regulation
Article 25 d (new)
Article 25 d
Algorithms and political views
The very large online platform’s algorithm, which selects content to be shown, shall never favour or disadvantage particular political views.
Amendment 530 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination of illegal content through their services; very large online platforms shall make a distinction between allegedly illegal, manifestly illegal and proven illegal content;
Amendment 532 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively, with particular regard to the freedom of expression;
Amendment 546 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of manifestly illegal content and of information that is incompatible with their terms and conditions.
Amendment 561 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
Amendment 573 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the obligations as set out in Chapter III:
Amendment 576 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
Amendment 578 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
Amendment 596 #
Proposal for a regulation
Article 29 – paragraph 1
1. Online platforms shall not make the recipients of their services subject to recommender systems based on profiling, unless the recipient of the service has expressed freely given, specific, informed and unambiguous consent. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to (1) modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679, or (2) see all information without manipulation.
Amendment 640 #
Proposal for a regulation
Article 32 a (new)
Article 32 a
The Freedom of Expression Officer
1. Very large online platforms shall appoint one or more Freedom of Expression Officers. These officers shall continuously assess whether the content removed from the very large online platform concerned is, in fact, illegal content.
2. The Freedom of Expression Officer shall have the professional qualifications, knowledge and experience necessary to assess whether content is in accordance with the applicable laws, or whether it should be deemed illegal.
3. Freedom of Expression Officers shall have the following tasks:
(a) continuously and, if need be, randomly assess whether deleted content was indeed illegal, or whether the content was within the boundaries of the law;
(b) where the Freedom of Expression Officer finds that content was not illegal and should not have been removed, notify the relevant departments in order to ensure that the content is reinstated; subsequently, the Freedom of Expression Officer shall draw up an accessible report on why the content should not be deemed illegal and why it should not have been removed, and the very large online platform shall publish this report as soon as possible on its designated website;
(c) where leaked information is disseminated via a very large online platform, the Freedom of Expression Officer may give a public recommendation as to whether the leaked information should be deemed manifestly illegal content and removed from the platform, or whether the content is in the interest of the public debate and subsequently protected by the freedom of expression; the Freedom of Expression Officer shall make this recommendation publicly available as soon as possible.
4. The Freedom of Expression Officer shall under no circumstance seek approval from any body within the very large online platform before publishing a report or a recommendation, and shall be permitted to exercise his or her duties completely independently.
5. The Freedom of Expression Officer shall be equipped with sufficient staff. A minimum of 0.5 % of the staff employed by the very large online platform shall be designated to the Bureau of the Freedom of Expression Officer.
6. A very large online platform may be exempted from the requirement set out in paragraph 5. In order to secure this exemption, the very large online platform shall submit a request to the competent Digital Services Coordinator. The Digital Services Coordinator shall grant this request if at least 90 % of the decisions made under Article 17 and Article 18 are in favour of the very large online platform.
Amendment 644 #
Proposal for a regulation
Article 33 – paragraph 3
3. Where a very large online platform considers that the publication of information pursuant to paragraph 2 may result in the disclosure of confidential information of that platform or of the recipients of the service, may cause significant vulnerabilities for the security of its service, may undermine public security or may harm recipients, the platform may remove such information from the reports. In that case, it shall however indicate to the Digital Services Coordinator of establishment and the Commission that information was removed from the reports, accompanied by a statement of the scope of the information removed and the reasons for which it was removed from the public reports.
Amendment 646 #
Proposal for a regulation
Article 33 a (new)
Article 33 a
Algorithm accountability
The very large online platform shall provide the Commission and interested Member States, under the condition of confidentiality, with all information necessary to perform an audit of the algorithms used, in order to verify how algorithms influence social and political debate and how they impact fundamental rights. When performing their audit, the Commission and interested Member States may seek advice from external researchers. The information necessary to perform the audit shall remain confidential; the audit itself shall be published.
Amendment 650 #
Proposal for a regulation
Article 35
Amendment 651 #
Proposal for a regulation
Article 35
Amendment 656 #
Proposal for a regulation
Article 36
Amendment 659 #
Proposal for a regulation
Article 37
Amendment 660 #
Proposal for a regulation
Article 37
Amendment 668 #
Proposal for a regulation
Article 38 – paragraph 3 – subparagraph 2
Member States shall make publicly available, and communicate to the Commission and the Board, the name of their competent authority designated as Digital Services Coordinator and information on how it can be contacted. The Commission shall likewise publish and keep up to date a register containing the names and contact information of the Digital Services Coordinators responsible in each Member State.
Amendment 676 #
Proposal for a regulation
Article 41 – paragraph 3 – subparagraph 3 – point b
(b) the temporary restriction does not unduly restrict access to lawful information by recipients of the service, having regard to the number of recipients affected and whether any adequate and readily accessible alternatives exist.
Amendment 678 #
Proposal for a regulation
Article 41 – paragraph 6
6. Member States shall ensure that any exercise of the powers pursuant to paragraphs 1, 2 and 3 is subject to the highest safeguards laid down in the applicable national law and in absolute conformity with the Charter and with the general principles of Union law. In particular, those measures shall only be taken in accordance with the right to respect for private life and the rights of defence, including the rights to be heard and of access to the file, and subject to the right to an effective judicial remedy of all affected parties.
Amendment 686 #
Proposal for a regulation
Article 42 – paragraph 4 a (new)
4 a. If the provider of intermediary services qualifies as an SME, all of the above-mentioned percentages shall be divided by three.
Amendment 688 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority. Complaints shall, to the extent possible, be made public.
Amendment 707 #
Proposal for a regulation
Article 64 – paragraph 1
1. The Commission shall publish the decisions it adopts pursuant to Articles 55(1), 56(1), 58, 59 and 60. Such publication shall state the names of the parties, the content of the decision and all the documents or other forms of information on which the decision is based, including any penalties imposed.
Amendment 710 #
Proposal for a regulation
Article 67 – paragraph 3 a (new)
3 a. Information stored in the information sharing system shall fall under the scope of Article 15 of the Treaty on the Functioning of the European Union and Article 42 of the Charter of Fundamental Rights of the European Union. This provision is without prejudice to Regulation (EC) No 1049/2001.
Amendment 711 #
Proposal for a regulation
Article 68 – paragraph 1 – point a a (new)
(a a) it has sufficient legal expertise as regards the law on freedom of speech and its limitations, including the applicable case-law of the European Court of Human Rights;
Amendment 714 #
Proposal for a regulation
Article 73 – paragraph 1 a (new)
1 a. The evaluation shall pay specific attention to the position of SMEs and the position of new competitors.