Activities of Patrick BREYER related to 2020/0361(COD)
Plenary speeches (1)
Digital Services Act (debate)
Opinions (1)
OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
Shadow opinions (1)
OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
Amendments (170)
Amendment 113 #
Amendment 118 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined appropriately and also covers information relating to illegal content, products, services and activities where such information is itself illegal. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that refers in an illegal manner to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 136 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admittance to a group of users, that information should be considered to have been disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision or selection on whom to grant access. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, are not considered to have been disseminated to the public. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. _________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
Amendment 137 #
Proposal for a regulation
Recital 15 a (new)
(15 a) The online activities of a person allow for deep insights into their personality as well as their past and future behaviour, making it possible to manipulate them. The high sensitivity of such information and its potential for abuse requires special protection. In line with the principle of data minimisation and in order to prevent unauthorised disclosure, identity theft and other forms of abuse of personal data, recipients should have the right to use and pay for information society services anonymously wherever technically possible. Anonymous payment can take place for example by paying in cash, by using cash-paid vouchers or prepaid payment instruments. The general and indiscriminate collection of personal data concerning every use of a digital service interferes disproportionately with the right to privacy. Users should therefore have a right not to be subject to pervasive tracking when using information society services. To this end, the processing of personal data concerning the use of services should be limited to the extent strictly necessary to provide the service and to bill the users. Processing personal data for displaying advertisements is not strictly necessary. Following the jurisprudence on communications meta-data, providers should not be required to indiscriminately retain personal data concerning the use of the service by all recipients. Applying effective end-to-end encryption to data is essential for trust in and security on the Internet, and effectively prevents unauthorised third party access. The fact that encryption technology is abused by some for illegal purposes does not justify generally weakening effective end-to-end encryption.
Amendment 143 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services has knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. The exemptions from liability established by this Regulation should not depend on uncertain notions such as an ‘active’, ‘neutral’ or ‘passive’ role of providers.
Amendment 150 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, after becoming aware of the unlawful nature of content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 151 #
Proposal for a regulation
Recital 15 a (new)
(15 a) The online activities of a person allow for deep insights into their personality as well as their past and future behaviour, making it possible to manipulate them. The high sensitivity of such information and its potential for abuse requires special protection. In line with the principle of data minimisation and in order to prevent unauthorised disclosure, identity theft and other forms of abuse of personal data, recipients should have the right to use and pay for information society services anonymously wherever technically possible. Anonymous payment can take place for example by paying in cash, by using cash-paid vouchers or prepaid payment instruments. The general and indiscriminate collection of personal data concerning every use of a digital service interferes disproportionately with the right to privacy. Users should therefore have a right not to be subject to pervasive tracking when using information society services. To this end, the processing of personal data concerning the use of services should be limited to the extent strictly necessary to provide the service and to bill the users. Processing personal data for displaying advertisements is not strictly necessary. Following the jurisprudence on communications meta-data, providers should not be required to indiscriminately retain personal data concerning the use of the service by all recipients. Applying effective end-to-end encryption to data is essential for trust in and security on the Internet, and effectively prevents unauthorised third party access. The fact that encryption technology is abused by some for illegal purposes does not justify generally weakening effective end-to-end encryption.
Amendment 155 #
Amendment 170 #
Proposal for a regulation
Recital 28 a (new)
(28 a) Providers of intermediary services should not be obliged to use automated tools for content moderation because such tools are incapable of effectively understanding the subtlety of context and meaning in human communication, which is necessary to determine whether assessed content violates the law or terms of service. Human review of automated reports by service providers or their contractors does not fully solve this problem, especially if it is outsourced to staff of private enterprises who lack sufficient independence, qualification and accountability.
Amendment 171 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders. The applicable rules on the mutual recognition of court decisions should be unaffected.
Amendment 176 #
Proposal for a regulation
Recital 30 a (new)
(30 a) In order to avoid conflicting interpretations of what constitutes illegal content and to ensure the accessibility of information that is legal in the Member State in which the provider is established, orders to act against illegal content should in principle be issued by judicial authorities of the Member State in which the provider has its main establishment, or, if not established in the Union, its legal representative. The judicial authorities of other Member States should be able to issue orders the effect of which is limited to the territory of that Member State. A special regime should apply to acting against unlawful commercial offers of goods and services.
Amendment 177 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. Since intermediaries should not be required to remove information which is legal in their country of origin, Union authorities should be able to order the blocking of content legally published outside the Union only for the territory of the Union where Union law is infringed and for the territory of the issuing Member State where national law is infringed.
Amendment 186 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable or restrict proposals by recommender systems of information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. The restriction of proposals by recommender systems can take place, for example, by practices of ‘shadow-banning’ content.
Amendment 192 #
Proposal for a regulation
Recital 42 a (new)
(42 a) When moderating content, mechanisms voluntarily employed by platforms should not lead to ex-ante control measures based on automated tools or upload-filtering of content. Automated tools are currently unable to differentiate illegal content from content that is legal in a given context and therefore routinely result in overblocking legal content. Human review of automated reports by service providers or their contractors does not fully solve this problem, especially if it is outsourced to private staff that lack sufficient independence, qualification and accountability. Ex-ante control should be understood to mean making publishing subject to an automated decision. Filtering automated content submissions such as spam should be permitted. Where automated tools are otherwise used for content moderation the provider should ensure human review and the protection of legal content.
Amendment 209 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable or restrict proposals by recommender systems of information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. The restriction of proposals by recommender systems can take place, for example, by practices of ‘shadow-banning’ content.
Amendment 214 #
Proposal for a regulation
Recital 42 a (new)
(42 a) When moderating content, mechanisms voluntarily employed by platforms should not lead to ex-ante control measures based on automated tools or upload-filtering of content. Automated tools are currently unable to differentiate illegal content from content that is legal in a given context and therefore routinely result in overblocking legal content. Human review of automated reports by service providers or their contractors does not fully solve this problem, especially if it is outsourced to private staff that lack sufficient independence, qualification and accountability. Ex-ante control should be understood to mean making publishing subject to an automated decision. Filtering automated content submissions such as spam should be permitted. Where automated tools are otherwise used for content moderation the provider should ensure human review and the protection of legal content.
Amendment 223 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Trusted flaggers should also have the possibility to submit notices of incorrect removal, disabling access to or restricting proposals by recommender systems of content or of suspensions or terminations of accounts. Digital Services Coordinators may award the status of trusted flaggers to entities solely for this purpose. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
Amendment 229 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom by online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open against the decisions taken in this regard by online platforms and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 234 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that a serious criminal offence involving a threat to the life or safety of a person is imminent, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing the information available to it that has given rise to suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
Amendment 244 #
Proposal for a regulation
Recital 65 a (new)
(65 a) Minimum interoperability requirements for very large online platforms can create new opportunities for the development of innovative services, overcome the lock-in effect of closed platforms and ensure competition and user choice. These requirements should allow for cross-platform interaction by recipients. Very large online platforms should provide an application programming interface through which third-party platforms and their recipients can interoperate with the main functionalities and recipients of the platform. Among the main functionalities can be the ability to receive information from certain accounts, to share provided content and react to it. The interoperability requirements do not prevent platforms from offering additional and new functions to their recipients.
Amendment 252 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address illegal content, there being no alternative and less restrictive measures that would effectively achieve the same result.
Amendment 259 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of manifestly illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of manifestly illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is manifestly illegal content or incompatible with an online platform’s terms and conditions.
Amendment 262 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment where mitigation is possible without adversely impacting fundamental rights. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. The decision as to the choice of measures should remain with the platform. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, without adversely impacting the fundamental rights of the recipients of the service.
Amendment 267 #
Proposal for a regulation
Recital 59
(59) Very large online platforms should, where appropriate, conduct their impact assessments and design their risk mitigation measures related to any adverse impact with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations.
Amendment 275 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient, and that those options are used by default.
Amendment 286 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the dissemination of illegal content using the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
Amendment 289 #
Proposal for a regulation
Recital 65 a (new)
(65 a) Minimum interoperability requirements for very large online platforms can create new opportunities for the development of innovative services, overcome the lock-in effect of closed platforms and ensure competition and user choice. These requirements should allow for cross-platform interaction by recipients. Very large online platforms should provide an application programming interface through which third-party platforms and their recipients can interoperate with the main functionalities and recipients of the platform. Among the main functionalities can be the ability to receive information from certain accounts, to share provided content and react to it. The interoperability requirements do not prevent platforms from offering additional and new functions to their recipients.
Amendment 295 #
Proposal for a regulation
Article 2 a (new)
Article 2 a Targeting of digital advertising 1. Providers of information society services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of determining the recipients to whom advertisements are displayed. 2. This provision shall not prevent information society services from determining the recipients to whom advertisements are displayed on the basis of contextual information such as keywords, the language setting communicated by the device of the recipient or the geographical region of the recipients to whom an advertisement is displayed. 3. The use of the contextual information referred to in paragraph 2 shall only be permissible if it does not allow for the direct or, by means of combining it with other information, indirect identification of one or more natural persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person or persons.
Amendment 296 #
Proposal for a regulation
Recital 67
(67) The Commission and the Board may encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
Amendment 298 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content may be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
Amendment 302 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan.
Amendment 308 #
Proposal for a regulation
Recital 71
(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms may be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.
Amendment 309 #
Proposal for a regulation
Recital 71 a (new)
(71 a) "Soft law" instruments such as codes of conduct and crisis protocols may pose a risk to fundamental rights because, unlike legislation, they are not subject to democratic scrutiny and their compliance with fundamental rights is not subject to judicial review. In order to enhance accountability, participation and transparency, procedural safeguards for drawing up codes of conduct and crisis protocols are needed.
Amendment 357 #
Proposal for a regulation
Article 8 – paragraph 4 a (new)
4 a. Member States shall ensure that the judicial authorities may, at the request of an applicant whose rights are infringed by the accessibility of illegal content, issue against the relevant provider of hosting services an order in accordance with this Article to remove or disable access to this content, including by way of an interlocutory injunction.
Amendment 359 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
Amendment 404 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(q a) ‘dark pattern’ means an online interface or a part thereof that via its structure, design or functionality subverts or impairs the autonomy, decision- making, preferences or choice of recipients of the service.
Amendment 411 #
Proposal for a regulation
Article 2 a (new)
Amendment 412 #
Proposal for a regulation
Article 2 b (new)
Amendment 414 #
Proposal for a regulation
Article 3 – paragraph 3
Amendment 416 #
Proposal for a regulation
Article 4 – paragraph 2
2. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
Amendment 429 #
Proposal for a regulation
Article 5 – paragraph 4
4. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
Amendment 431 #
Proposal for a regulation
Article 6
Amendment 439 #
Proposal for a regulation
Article 7 – title
No general monitoring, active fact-finding or automated content moderation obligations
Amendment 442 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
Providers of intermediary services shall not be obliged to use automated tools for content moderation.
Amendment 444 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt via a secure communications channel of an order to act against a specific item of illegal content, issued by a national judicial or administrative authority, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the order, without undue delay, specifying the action taken and the moment when the action was taken. This rule shall apply mutatis mutandis in respect of competent administrative authorities ordering online platforms to act against traders unlawfully promoting or offering products or services in the Union.
Amendment 456 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
— information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content;
Amendment 458 #
Proposal for a regulation
Article 8 – paragraph 2 – point b
(b) the territorial scope of an order addressed to a provider that has its main establishment in the Member State issuing the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective;
Amendment 461 #
Proposal for a regulation
Article 8 – paragraph 2 – point b a (new)
(b a) the territorial scope of an order addressed to a provider that has its main establishment in another Member State is limited to the territory of the Member State issuing the order;
Amendment 462 #
Proposal for a regulation
Article 8 – paragraph 2 – point b b (new)
(b b) if addressed to a provider that has its main establishment outside the Union, the territorial scope of the order is limited to the territory of the Member State issuing the order;
Amendment 463 #
Proposal for a regulation
Article 8 – paragraph 2 – point b c (new)
(b c) Points (b a) and (b b) shall not apply where online platforms are ordered to act against traders established in the same Member State as the issuing authority that are unlawfully promoting or offering products or services in the Union.
Amendment 465 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove or disable access to or restrict proposals by recommender systems of specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to or restricting proposals of that information, it shall inform the recipient, where he or she provided contact details, at the latest at the time of the removal or disabling of access or the restricting of proposals, of the decision and provide a clear and specific statement of reasons for that decision.
Amendment 470 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails either the removal of, or the disabling of access to or the restricting of proposals by recommender systems of, the information and, where relevant, the territorial scope of the disabling of access or the restricting of proposals;
Amendment 473 #
Proposal for a regulation
Article 8 – paragraph 3
3. The Digital Services Coordinator from the Member State of the judicial or administrative authority issuing the order shall, without undue delay, transmit a copy of the orders referred to in paragraph 1 to all other Digital Services Coordinators through the system established in accordance with Article 67.
Amendment 476 #
Proposal for a regulation
Article 8 – paragraph 3 a (new)
3 a. Digital Services Coordinators shall publish a ‘toolbox’ of complaint and redress mechanisms applicable in their respective territory, in at least one of the official languages of the Member State where they operate.
Amendment 481 #
Proposal for a regulation
Article 8 – paragraph 4 a (new)
4 a. The Commission shall adopt implementing acts pursuant to Article 291 TFEU, laying down a European technical standard for secure communication channels. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70.
Amendment 483 #
Proposal for a regulation
Article 8 – paragraph 4 b (new)
4 b. Member States shall ensure that the judicial authorities may, at the request of an applicant whose rights are infringed by the accessibility of illegal content, issue against the relevant provider of hosting services an order in accordance with this Article to remove or disable access to this content, including by way of an interlocutory injunction.
Amendment 486 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt via a secure communications channel of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by a national judicial or administrative authority on the basis of the applicable Union or national law, in conformity with Union law, for the purpose of preventing serious threats to public security, inform without undue delay the authority issuing the order of its receipt and the effect given to the order via a secure communications channel.
Amendment 492 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove or disable access to or restrict proposals by recommender systems of the information;
Amendment 493 #
Proposal for a regulation
Article 9 – paragraph 2 – point -a (new)
(-a) the order is issued for the purpose of preventing serious threats to public security;
Amendment 494 #
Proposal for a regulation
Article 9 – paragraph 2 – point -a a (new)
(-a a) the order seeks information on a suspect or suspects of a serious threat to public security;
Amendment 497 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons setting out why the measure is necessary and proportional, taking due account of the impact of the measure on the fundamental rights of the specific recipient of the service whose data is sought and the seriousness of the offences;
Amendment 500 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
- a unique identifier of the recipients on whom information is sought;
Amendment 502 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2
— information about redress mechanisms available to the provider and to the recipients of the service concerned;
Amendment 504 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and non-arbitrary manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not manifestly illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
Amendment 504 #
Proposal for a regulation
Article 9 – paragraph 2 – point b
(b) the order only requires the provider to provide information already legally collected for the purposes of providing the service and which lies within its control;
Amendment 509 #
Proposal for a regulation
Article 9 – paragraph 3
3. The Digital Services Coordinator from the Member State of the national judicial or administrative authority issuing the order shall, without undue delay, transmit a copy of the order referred to in paragraph 1 to all Digital Services Coordinators through the system established in accordance with Article 67.
Amendment 510 #
Proposal for a regulation
Article 9 – paragraph 4
4. The provider shall inform, without undue delay, the recipient whose data is being sought. As long as this is necessary and proportionate, and is in order to protect the fundamental rights of another person, the issuing judicial authority, taking due account of the impact of the measure on the fundamental rights of the person whose data is sought, may request the provider to delay informing the recipient. Such a request shall be duly justified, specify the duration of the obligation of confidentiality and be subject to periodic review.
Amendment 515 #
Proposal for a regulation
Article 9 – paragraph 4 a (new)
4 a. This Article shall apply, mutatis mutandis, in respect of competent administrative authorities ordering online platforms to provide the information listed in Article 22.
Amendment 516 #
Proposal for a regulation
Article 9 – paragraph 4 b (new)
4 b. Where information is sought for the purpose of criminal proceedings, Regulation (EU) 2021/XXXX on access to electronic evidence shall apply.
Amendment 517 #
Proposal for a regulation
Article 9 – paragraph 4 c (new)
4 c. Providers of intermediary services shall transfer personal data on recipients of their service requested by public authorities only where the conditions set out in this Article are met.
Amendment 518 #
Proposal for a regulation
Article 9 – paragraph 4 d (new)
4 d. The Commission shall adopt implementing acts pursuant to Article 291 of the Treaty on the Functioning of the European Union (TFEU), establishing a common European information exchange system with secure channels for the handling of authorised cross-border communications, authentication and transmission of the order referred to in paragraph 1 and, where applicable, of the requested data between the competent judicial authority and the provider. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70.
Amendment 520 #
Proposal for a regulation
Article -10 (new)
Article -10 Exclusion for micro enterprises and not-for-profit services This Chapter shall not apply to online platforms that qualify as micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC or as a not-for-profit service with fewer than 100.000 monthly active users.
Amendment 531 #
Proposal for a regulation
Article 18 a (new)
Article 18 a Judicial redress Member States shall ensure that the judicial authorities may, at the request of a recipient who is subject to the decision of an online platform to (a) remove or disable access to or restrict proposals by recommender systems of information provided by the recipient; (b) suspend or terminate the provision of the service, in whole or in part, to the recipient; (c) suspend or terminate the recipient’s account, review the legality of this decision and, where appropriate, issue an interlocutory injunction.
Amendment 536 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible and machine-readable format in the languages in which the service is offered.
Amendment 541 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1a. Providers of intermediary services shall publish summary versions of their terms and conditions in clear, user-friendly and unambiguous language, and in an easily accessible and machine-readable format. Such a summary shall include information on remedies and redress mechanisms pursuant to Articles 17 and 18, where available. By 31 December 2024, the Commission shall, after consulting the Board and stakeholders, adopt implementing acts specifying a contract summary template to be used by the providers to fulfil their obligations under this article.
Amendment 543 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a fair, transparent, coherent, predictable, non-discriminatory, diligent, non-arbitrary and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 548 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. The terms and conditions of providers of intermediary services may exclude legal information from those services or otherwise limit the access to legal information or the access and other rights of those exchanging it only where objectively justified and on clearly defined grounds.
Amendment 552 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2b. Terms and conditions of providers of intermediary services shall respect the essential principles of human rights as enshrined in the Charter and international law.
Amendment 555 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2c. Terms that do not comply with this Article shall not be binding on recipients.
Amendment 569 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures, as well as the measures taken to qualify content moderators and to ensure that non-infringing content is not affected;
Amendment 582 #
Proposal for a regulation
Article 13 a (new)
Article 13a
Online interface design
1. The use of dark patterns by providers of intermediary services when presenting options to or interacting with recipients of the service through their online interfaces is prohibited.
2. A choice or decision made by the recipient of the service using online interfaces that do not comply with the requirements of paragraph 1 shall not constitute consent.
3. The Commission shall publish official guidelines including a list of specific design patterns that qualify as subverting or impairing the autonomy, decision-making, or choice of the recipients of the service. The Commission shall keep this list updated in the light of technological developments and, in the case of very large online platforms, assessments related to adverse impacts identified in accordance with Article 27(2).
Amendment 598 #
Proposal for a regulation
Article 14 – paragraph 2 – point c
Amendment 599 #
Proposal for a regulation
Article 14 – paragraph 2 – point c a (new)
(ca) where an alleged infringement of an intellectual property right is notified, evidence that the entity submitting the notice is the rights holder of the intellectual property right that is allegedly infringed or is authorised to act on behalf of that rights holder;
Amendment 600 #
Proposal for a regulation
Article 14 – paragraph 2 – point d
(d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete.
Amendment 601 #
Proposal for a regulation
Article 14 – paragraph 2 – point d – indent 1 (new)
– The individual or entity may optionally provide their name and an electronic mail address which shall not be disclosed to the content provider except in cases of alleged violations of intellectual property rights.
Amendment 602 #
Proposal for a regulation
Article 14 – paragraph 3
Amendment 608 #
Proposal for a regulation
Article 14 – paragraph 4 a (new)
4a. Upon receipt of the notice, the service provider shall notify the information providers of the elements referred to in paragraph 2 and give them the opportunity to reply before taking a decision.
Amendment 609 #
Proposal for a regulation
Article 14 – paragraph 4 b (new)
4b. Notified information shall remain accessible until a decision is taken in respect thereof. Providers of intermediary services shall not be held liable for failure to remove notified information while the assessment of legality is still pending.
Amendment 610 #
Proposal for a regulation
Article 14 – paragraph 4 c (new)
4c. The provider shall ensure that decisions on notices are taken by qualified staff to whom adequate initial and ongoing training on the applicable legislation and international human rights standards as well as appropriate working conditions are to be provided, including, where necessary, the opportunity to seek professional support, qualified psychological assistance and qualified legal advice.
Amendment 611 #
Proposal for a regulation
Article 14 – paragraph 5
5. The provider shall also, without undue delay, notify the submitting individual or entity as well as the information provider of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision.
Amendment 617 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and non-arbitrary manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Amendment 626 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove or disable access to or restrict proposals by recommender systems of specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to or restricting proposals of that information, it shall inform the recipient, where he or she provided contact details, at the latest at the time of the removal or disabling of access or the restricting of proposals, of the decision and provide a clear and specific statement of reasons for that decision.
Amendment 629 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails the removal of, the disabling of access to, or the restricting of proposals by recommender systems of, the information and, where relevant, the territorial scope of the disabling of access or the restricting of proposals;
Amendment 633 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
(c) where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means;
Amendment 636 #
Proposal for a regulation
Article 15 – paragraph 4
4. Providers of hosting services shall publish the decisions and the statements of reasons referred to in paragraph 1 in a publicly accessible and machine-readable database managed by the Commission. That information shall not contain personal data.
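Purely as an illustration of what one machine-readable entry in such a database could look like — the schema below is a hypothetical assumption, not a prescribed format — a minimal Python sketch, with personal data omitted as paragraph 4 requires:

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class StatementOfReasons:
        decision_type: str          # "removal", "disabling_of_access" or "restriction_of_proposals"
        territorial_scope: list     # e.g. ["DE", "FR"], where relevant
        ground: str                 # provision of law or of the terms and conditions relied on
        facts_and_circumstances: str
        automated_detection: bool   # content detected or identified using automated means
        automated_decision: bool    # decision taken using automated means
        redress_options: list       # e.g. internal complaint, out-of-court settlement, court

    record = StatementOfReasons(
        decision_type="disabling_of_access",
        territorial_scope=["DE"],
        ground="terms and conditions, section 4.2",
        facts_and_circumstances="Notice received under Article 14; reviewed by qualified staff.",
        automated_detection=False,
        automated_decision=False,
        redress_options=["internal_complaint", "out_of_court_settlement", "court"],
    )

    print(json.dumps(asdict(record), indent=2))  # record as submitted to the database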
Amendment 640 #
Proposal for a regulation
Article 15 a (new)
Article 15a
Content moderation
1. Providers of hosting services shall not use ex-ante control measures based on automated tools or upload-filtering of content for content moderation. Where providers of hosting services otherwise use automated tools for content moderation, they shall ensure that qualified staff decide on any action to be taken and that legal content which does not infringe the terms and conditions set out by the providers is not affected. The provider shall ensure that adequate initial and ongoing training on the applicable legislation and international human rights standards as well as appropriate working conditions are provided to staff, and that, where necessary, they are given the opportunity to seek professional support, qualified psychological assistance and qualified legal advice. This paragraph shall not apply to moderating information which has most likely been provided by automated tools.
2. Providers of hosting services shall act in a fair, transparent, coherent, predictable, non-discriminatory, diligent, non-arbitrary and proportionate manner when moderating content, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 643 #
Proposal for a regulation
Article 16 – title
Exclusion for micro and small enterprises
Amendment 646 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 651 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service and qualified entities as defined in Article 3, point (4), of Directive (EU) 2020/1828 of the European Parliament and of the Council1a, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform:
_________________
1a Directive (EU) 2020/1828 of the European Parliament and of the Council of 25 November 2020 on representative actions for the protection of the collective interests of consumers and repealing Directive 2009/22/EC (OJ L 409, 4.12.2020, p. 1).
Amendment 653 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove or disable access to or restrict proposals by recommender systems of the information;
Amendment 671 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and non-arbitrary manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not manifestly illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
Amendment 678 #
Proposal for a regulation
Article 18 – paragraph 1 – introductory part
1. Recipients of the service addressed by the decisions referred to in Article 17(1) and qualified entities as defined in Article 3, point (4) of Directive (EU) 2020/1828, shall be entitled to select any out-of- court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 686 #
Proposal for a regulation
Article 18 – paragraph 2 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms and its members are remunerated in a way that is not linked to the outcome of the procedure;
Amendment 689 #
Proposal for a regulation
Article 18 – paragraph 2 – point a a (new)
(aa) it is composed of legal experts;
Amendment 690 #
Proposal for a regulation
Article 18 – paragraph 2 – point b a (new)
(ba) natural persons with responsibility for dispute settlement commit not to work for the online platform or a professional organisation or business association of which the online platform is a member for a period of three years after their position in the body has ended;
Amendment 691 #
Proposal for a regulation
Article 18 – paragraph 2 – point b b (new)
(bb) natural persons with responsibility for dispute resolution must not have worked for an online platform or a professional organisation or business association of which the online platform is a member for a period of two years before taking up their position in the body;
Amendment 695 #
Proposal for a regulation
Article 18 – paragraph 2 – point e
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure which are easily and publicly accessible.
Amendment 698 #
Proposal for a regulation
Article 18 – paragraph 3 – introductory part
3. If the body decides the dispute in favour of the recipient of the service, the online platform shall reimburse the recipient for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, the recipient shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement. Out-of-court dispute settlement procedures shall preferably be free of charge for the recipient of the service. In the event that costs are applied, the procedure shall be accessible, attractive and inexpensive for recipients of the service. To that end, costs shall not exceed a nominal fee.
Amendment 704 #
Proposal for a regulation
Article 18 a (new)
Amendment 712 #
Proposal for a regulation
Article 19 – paragraph 1 a (new)
1a. Online platforms shall take the necessary technical and organisational measures to ensure that notices of incorrect removal, disabling access to or restricting proposals by recommender systems of content or of suspensions or terminations of accounts, submitted by trusted flaggers, are processed and decided upon with priority and without delay.
Amendment 725 #
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
(ca) it publishes, at least once a year, clear, easily comprehensible and detailed reports on the notices submitted in accordance with Article 14 during the relevant period.
Amendment 737 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices or notices regarding legal content through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
Amendment 745 #
Proposal for a regulation
Article 20 – paragraph 1
Amendment 756 #
Proposal for a regulation
Article 20 – paragraph 3 – introductory part
3. Online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the online platform. Those circumstances shall include at least the following:
Amendment 757 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year;
Amendment 766 #
Proposal for a regulation
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
Amendment 770 #
Proposal for a regulation
Article 21 – paragraph 1
1. Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons is imminent, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide the information that gave rise to it.
Amendment 772 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, to be taking place or to be likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located.
Amendment 804 #
Proposal for a regulation
Article 43 – title
Right to lodge a complaint and right to an effective judicial remedy
Amendment 810 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Without prejudice to any other administrative or non-judicial remedy, each natural or legal person shall have the right to an effective judicial remedy against a legally binding decision of a Digital Services Coordinator concerning them.
Amendment 811 #
Proposal for a regulation
Article 43 – paragraph 1 b (new)
Amendment 814 #
Proposal for a regulation
Article 23 – paragraph 1 – point b
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
Amendment 853 #
Proposal for a regulation
Article 26 – title
Amendment 857 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant adverse impact stemming from the functioning and use made of their services in the Union. This impact assessment shall be specific to their services and shall include the following systemic risks:
Amendment 862 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination of manifestly illegal content through their services;
Amendment 871 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
Amendment 874 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisements influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of manifestly illegal content and of information that is incompatible with their terms and conditions.
Amendment 878 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
2a. The outcome of the impact assessment and supporting documents shall be communicated to the Board of Digital Services Coordinators and the Digital Services Coordinator of establishment. A summary version of the impact assessment shall be made publicly available in an easily accessible format.
Amendment 879 #
Proposal for a regulation
Article 27 – title
Mitigation of adverse impacts
Amendment 880 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms may put in place reasonable, proportionate and effective mitigation measures, tailored to the specific adverse impact identified pursuant to Article 26, where mitigation is possible without adversely impacting other fundamental rights. Such measures may include, where applicable:
Amendment 887 #
Proposal for a regulation
Article 27 – paragraph 1 – point a a (new)
(aa) appropriate technical and operational measures or capacities, such as appropriate staffing or technical means to expeditiously remove or disable access to illegal content which the platform is aware of;
Amendment 888 #
Proposal for a regulation
Article 27 – paragraph 1 – point a b (new)
(ab) easily accessible and user-friendly mechanisms for users to report or flag allegedly illegal content, and mechanisms for user moderation;
Amendment 890 #
Proposal for a regulation
Article 27 – paragraph 1 – point c
(c) reinforcing the internal processes or supervision of any of their activities, in particular as regards detection of systemic risk;
Amendment 891 #
Proposal for a regulation
Article 27 – paragraph 1 – point d
Amendment 895 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
Amendment 896 #
Proposal for a regulation
Article 27 – paragraph 1 – subparagraph 1 (new)
The decision as to the choice of measures shall remain with the platform.
Amendment 901 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of the most prominent and recurrent adverse impacts reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
Amendment 905 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
(b) best practices for very large online platforms to mitigate the adverse impacts identified.
Amendment 909 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general recommendations on the application of paragraph 1 in relation to specific risks, in particular to present best practices and propose possible measures, having due regard to the possible consequences of the measures on the fundamental rights enshrined in the Charter of all parties involved. When preparing those recommendations, the Commission shall organise public consultations.
Amendment 913 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following:
Amendment 916 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III;
Amendment 917 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
Amendment 924 #
Proposal for a regulation
Article 28 – paragraph 4
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to taking the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
Amendment 928 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, meaningful information about the logic involved and the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679 and which is activated by default. Basing recommender systems on profiling shall require the explicit consent of the recipient, as defined in Article 4, point (11), of Regulation (EU) 2016/679.
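A minimal sketch of the default behaviour this paragraph describes, assuming hypothetical Item and User types and a made-up consent flag; profiling-based ranking runs only after an explicit opt-in:

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Item:
        title: str
        published_at: datetime
        affinity: float = 0.0  # hypothetical profile-based relevance score

    @dataclass
    class User:
        explicit_profiling_consent: bool = False  # profiling is off by default

    def recommend(items: list, user: User) -> list:
        if user.explicit_profiling_consent:
            # profiling-based ranking, only after explicit consent (GDPR Art. 4(11))
            return sorted(items, key=lambda i: i.affinity, reverse=True)
        # default option not based on profiling: reverse-chronological order
        return sorted(items, key=lambda i: i.published_at, reverse=True)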
Amendment 943 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2a. Very large online platforms that use recommender systems shall allow the recipient of the service to have information presented to them in chronological order only and, alternatively, where technically possible, to use third-party recommender systems. Third-party recommender systems shall have access to the same information that is available to the recommender systems used by the platform. They shall process this information only to provide recommendations to the recipient.
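Continuing the sketch above (reusing its hypothetical Item and User types), one way a platform might expose the same inputs to a recipient-chosen third-party recommender; the Protocol name and rank method are assumptions, not a defined interface:

    from typing import Protocol, Sequence

    class Recommender(Protocol):
        def rank(self, items: Sequence[Item], user: User) -> list: ...

    class ChronologicalRecommender:
        # the chronological-only presentation required by this paragraph
        def rank(self, items, user):
            return sorted(items, key=lambda i: i.published_at, reverse=True)

    def build_feed(items, user, recommender: Recommender):
        # whichever recommender the recipient selected receives the same
        # information as the platform's own system, and may use it only
        # to produce recommendations
        return recommender.rank(items, user)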
Amendment 971 #
Proposal for a regulation
Article 31 – paragraph 3
Amendment 1001 #
Proposal for a regulation
Article 33 – paragraph 2 – point a
(a) a report setting out the results of the impact assessment pursuant to Article 26;
Amendment 1002 #
Proposal for a regulation
Article 33 – paragraph 2 – point b
(b) the related risk mitigation measures identified and implemented pursuant to Article 27;
Amendment 1007 #
Proposal for a regulation
Article 33 a (new)
Article 33a
Interoperability
1. By 31 December 2024, very large online platforms shall make the main functionalities of their services interoperable with other online platforms to enable the cross-platform exchange of information. This obligation shall not limit, hinder or delay their ability to solve security issues. The cross-platform exchange of information shall require the informed consent of the recipients exchanging information. Online platforms shall not process information obtained for the purpose of cross-platform information exchange for other purposes. Very large online platforms shall publicly document all application programming interfaces they make available.
2. The Commission shall adopt implementing acts specifying the nature and scope of the obligations set out in paragraph 1. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70.
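As a rough illustration of the kind of documented, consent-gated exchange endpoint paragraph 1 could imply — every name and field here is a hypothetical assumption, since the actual nature and scope would be fixed by the implementing acts under paragraph 2:

    def deliver(recipient: str, content: str) -> None:
        # hypothetical local delivery into the recipient's inbox or feed
        print(f"delivered to {recipient}: {content}")

    def receive_cross_platform_post(payload: dict, consent_registry: dict) -> dict:
        """Hypothetical documented endpoint accepting a post from another platform."""
        recipient = payload["addressee"]  # local account being written to
        # the informed consent of the recipients exchanging information is required
        if not consent_registry.get(recipient, False):
            return {"status": "rejected", "reason": "no consent for cross-platform exchange"}
        deliver(recipient, payload["content"])
        # purpose limitation: the payload must not be processed for anything else,
        # e.g. not fed into advertising profiles or recommender training
        return {"status": "accepted"}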
Amendment 1015 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board may facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and adverse impacts, in accordance with Union law, in particular on competition and the protection of personal data.
Amendment 1020 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant adverse impacts within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 1022 #
Proposal for a regulation
Article 35 – paragraph 3
Amendment 1027 #
Proposal for a regulation
Article 35 – paragraph 4
4. The Commission and the Board may assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and may regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions.
Amendment 1029 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Board may regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain.
Amendment 1033 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission may facilitate the drawing up of voluntary codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities, to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30.
Amendment 1039 #
Proposal for a regulation
Article 36 – paragraph 3
Amendment 1045 #
Proposal for a regulation
Article 37 – paragraph 1
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health.
Amendment 1046 #
Proposal for a regulation
Article 37 – paragraph 2 – introductory part
2. The Commission may encourage and facilitate very large online platforms and, where appropriate, other online platforms, with the involvement of the Commission, to participate in the drawing up, testing and application of those crisis protocols, which include one or more of the following measures:
Amendment 1047 #
Proposal for a regulation
Article 37 – paragraph 3
3. The Commission may involve, as appropriate, Member States’ authorities and Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols.
Amendment 1048 #
Proposal for a regulation
Article 37 a (new)
Amendment 1064 #
Proposal for a regulation
Article 43 – title
Right to lodge a complaint and right to an effective judicial remedy
Amendment 1071 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Without prejudice to any other administrative or non-judicial remedy, each natural or legal person shall have the right to an effective judicial remedy against a legally binding decision of a Digital Services Coordinator concerning them.
Amendment 1073 #
Proposal for a regulation
Article 43 – paragraph 1 b (new)
Without prejudice to any other administrative or non-judicial remedy, each recipient shall have the right to an effective judicial remedy where the competent Digital Services Coordinator does not handle a complaint or does not inform the recipient within three months on the progress or outcome of the complaint lodged pursuant to paragraph 1.
Amendment 1077 #
Proposal for a regulation
Article 44 – paragraph 2 – point a
(a) the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 8 and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned;
Amendment 1116 #
Proposal for a regulation
Article 50 – paragraph 2
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may include, where appropriate, participation in a code of conduct as provided for in Article 35.
Amendment 1119 #
Proposal for a regulation
Article 50 – paragraph 3 – subparagraph 1
Where the Digital Services Coordinator of establishment has concerns on the ability of the measures to terminate or remedy the infringement, it may request the very large online platform concerned to subject itself to an additional, independent audit to assess the effectiveness of those measures in terminating or remedying the infringement. In that case, that platform shall send the audit report to that Digital Services Coordinator, the Commission and the Board within four months from the decision referred to in the first subparagraph. When requesting such an additional audit, the Digital Services Coordinator may specify a particular audit organisation that is to carry out the audit, at the expense of the platform concerned, selected on the basis of criteria set out in Article 28(2).
Amendment 1130 #
Proposal for a regulation
Article 15 a (new)
Article 15a
Providers of hosting services shall not use ex-ante control measures based on automated tools or upload-filtering of content for content moderation. Where providers of hosting services use automated tools for content moderation, they shall ensure that qualified staff decide on any action to be taken and that legal content which does not infringe the terms and conditions set out by the providers is not affected. The provider shall ensure that adequate initial and ongoing training on the applicable legislation and international human rights standards as well as appropriate working conditions are provided to staff. This paragraph shall not apply to moderating information which has most likely been provided by automated tools.
Amendment 1133 #
Proposal for a regulation
Article 56 – paragraph 1
1. If, during proceedings under this Section, the very large online platform concerned offers lawful commitments to ensure compliance with the relevant provisions of this Regulation, the Commission may by decision make those commitments binding on the very large online platform concerned and declare that there are no further grounds for action.
Amendment 1134 #
Proposal for a regulation
Article 56 – paragraph 2 – point b
(b) where the very large online platform concerned acts contrary to its lawful commitments; or
Amendment 1144 #
Proposal for a regulation
Article 68 – paragraph 1 – introductory part
Without prejudice to Directive 2020/XX/EU of the European Parliament and of the Council52, recipients of intermediary services shall have the right to mandate a body, organisation or association to exercise the rights referred to in Articles 12, 13, 14, 15, 17, 18 and 19 on their behalf, provided the body, organisation or association meets all of the following conditions:
_________________
52 [Reference]