Activities of Karen MELCHIOR related to 2020/0361(COD)
Plenary speeches (1)
Digital Services Act (continuation of debate)
Shadow opinions (1)
OPINION on the proposal for a regulation of the European Parliament and of the Council on Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
Amendments (682)
Amendment 29 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to gender equality and non-discrimination. Children, especially girls, have specific rights enshrined in Article 24 of the Charter and in the United Nations Convention on the Rights of the Child. As such, the best interests of the child should be a primary consideration in all matters affecting them. The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.
Amendment 49 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as health, including mental health, the safety and trust of the recipients of the service, including minors, women, LGBTIQ+ people and vulnerable users such as those with protected characteristics under Article 21 of the Charter, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. The World Health Organisation defines ‘health’ as a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity. This definition supports the fact that the development of new technologies might bring new health risks to users, in particular for children and women, such as psychological risks, developmental risks, mental risks, depression, loss of sleep, or altered brain function.
Amendment 59 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination, the right to gender equality and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
Amendment 69 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices.
Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 86 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including gender equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
Amendment 92 #
Proposal for a regulation
Article 2 – paragraph 1 – point d a (new)
(d a) ‘child’ means any natural person under the age of 18;
Amendment 100 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1 a. Providers of intermediary services shall ensure their terms and conditions are age-appropriate, promote gender equality and the rights of LGBTIQ+ people and meet the highest European or International standards, pursuant to Article 34.
Amendment 100 #
Proposal for a regulation
Recital 4 a (new)
(4 a) As the Union is a Party to the United Nations Convention on the Rights of Persons with Disabilities (UN CRPD), the provisions of the Convention are an integral part of the Union legal order and binding upon the Union and its Member States. The UN CRPD requires its Parties to take appropriate measures to ensure that persons with disabilities have access, on an equal basis with others, to information and communications technologies and systems, and to other facilities and services open or provided to the public, both in urban and in rural areas. General Comment No. 2 to the UN CRPD further states that “The strict application of universal design to all new goods, products, facilities, technologies and services should ensure full, equal and unrestricted access for all potential consumers, including persons with disabilities, in a way that takes full account of their inherent dignity and diversity.” Given the ever-growing importance of digital services and platforms in private and public life, in line with the obligations enshrined in the UN CRPD, the EU must ensure a regulatory framework for digital services which protects the rights of all recipients of services, including persons with disabilities.
Amendment 101 #
Proposal for a regulation
Recital 5 a (new)
(5 a) Given the cross-border nature of the services at stake, EU action to harmonise accessibility requirements for intermediary services across the internal market is vital to avoid market fragmentation and to ensure that the equal right of access to, and choice of, those services by all consumers and other recipients of services, including persons with disabilities, is protected throughout the Union. Lack of harmonised accessibility requirements for digital services and platforms will also create barriers for the implementation of existing Union legislation on accessibility, as many of the services falling under those laws will rely on intermediary services to reach end-users. Therefore, accessibility requirements for intermediary services, including their user interfaces, must be consistent with existing Union accessibility legislation, such as the European Accessibility Act and the Web Accessibility Directive, so that no one is left behind as a result of digital innovation. This aim is in line with the Union of Equality: Strategy for the Rights of Persons with Disabilities 2021-2030 and the Union’s commitment to the United Nations’ Sustainable Development Goals.
Amendment 102 #
Proposal for a regulation
Recital 5 b (new)
(5 b) The notions of ‘access’ or ‘accessibility’ are often referred to with the meaning of affordability (financial access), availability, or in relation to access to data, use of network, etc. It is important to distinguish these from ‘accessibility for persons with disabilities’ which means that services, technologies and products are perceivable, operable, understandable and robust for persons with disabilities.
Amendment 105 #
Proposal for a regulation
Article 12 a (new)
Article 12 a
Child impact assessment
1. All providers shall assess whether their services are accessed by, likely to be accessed by, or impact children, especially girls. Providers of services likely to impact children, especially girls, shall identify, analyse and assess, during the design and development of new services, on an ongoing basis and at least once a year, any systemic risks stemming from the functioning and use of their services in the Union for children, especially girls. These risk impact assessments shall be specific to their services, meet the highest European or International standards detailed in Article 34, and shall consider all known content, contact, conduct or commercial risks included in the contract. Assessments shall also include the following systemic risks:
(a) the dissemination of illegal content or behaviour enabled, manifested on or as a result of their services;
(b) any negative effects for the exercise of the rights of the child, as enshrined in Article 24 of the Charter and in the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment;
(c) any negative effects on the right to gender equality, as enshrined in Article 23 of the Charter, particularly the right to live free from violence as envisaged by the Council of Europe Convention on preventing and combating violence against women and girls (Istanbul Convention);
(d) any negative effects on the right to non-discrimination, as enshrined in Article 21 of the Charter;
(e) any intended or unintended consequences resulting from the operation or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on children's rights, especially of girls.
2. When conducting such impact assessments, providers of intermediary services likely to impact children, especially girls, shall take into account, in particular, how their terms and conditions, content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions or with the rights of the child, especially of girls.
Amendment 105 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, among others, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. To assist Member States and providers, the Commission should provide guidelines as to how to interpret the interaction between different Union acts and how to prevent any duplication of requirements on providers or potential conflicts in the interpretation of similar requirements.
_________________
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1.
29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
Amendment 106 #
Article 12 b
Mitigation of risks to children, especially girls
Providers of intermediary services likely to impact children, especially girls, shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 12 a. Such measures shall include, where applicable:
(a) implementing mitigation measures identified in Article 27 with regard for children’s best interests;
(b) adapting or removing system design features that expose children to content, contact, conduct and contract risks, as identified in the process of conducting child impact assessments;
(c) implementing proportionate and privacy preserving age assurance, meeting the standard outlined in Article 34;
(d) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure they prioritise the best interests of the child and gender equality;
(e) ensuring the highest levels of privacy, safety, and security by design and default for users under the age of 18;
(f) preventing profiling of children, including for commercial purposes like targeted advertising;
(g) ensuring published terms are age appropriate and uphold children’s rights and gender equality;
(h) providing child-friendly and inclusive mechanisms for remedy and redress, including easy access to expert advice and support.
Amendment 110 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, gender and (if children) the age of complainants, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
Amendment 111 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1 a. Providers of intermediary services that impact children, especially girls, shall publish, at least once a year:
(a) child impact assessments to identify known harms, unintended consequences and emerging risks; these impact assessments shall comply with the standards outlined in Article 34;
(b) clear, easily comprehensible and detailed reports outlining the gender equality and child risk mitigation measures undertaken, their efficacy and any outstanding actions required; these reports shall comply with the standards outlined in Article 34, including as regards age assurance and age verification, in line with a child-centred design that equally promotes gender equality.
Amendment 116 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or neutral hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks or providers of services deeper in the internet stack, such as IT infrastructure services (on-premise, cloud-based and/or hybrid hosting solutions), that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service. Services deeper in the internet stack acting as online intermediaries could be required to take proportionate actions where the customer fails to remove the illegal content, unless technically impracticable.
Amendment 117 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined appropriately and also covers information relating to illegal content, products, services and activities where such information is itself illegal. In particular, that concept should be understood to refer to information, irrespective of its form, that under Union or national law as a result of its display on an intermediary service is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or illegal due to its direct connection to or promotion of an illegal activity, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, illegal trading of animals, plants and substances, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law including the Charter of Fundamental Rights of the European Union and what the precise nature or subject matter is of the law in question.
Amendment 120 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation impeding upon the ability of providers to undertake proactive measures to identify and remove illegal content and to prevent its reappearance.
Amendment 124 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling and redress systems are easy to access and user-friendly, including for children, especially girls, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.
Amendment 126 #
Proposal for a regulation
Recital 12 a (new)
(12 a) Material disseminated for educational, journalistic, artistic or research purposes or for the purposes of preventing or countering illegal content, including content which represents an expression of polemic or controversial views in the course of public debate, should not be considered as illegal content. Similarly, material, such as an eye-witness video of a potential crime, should not be considered as illegal merely because it depicts an illegal act. An assessment should determine the true purpose of that dissemination and whether material is disseminated to the public for those purposes.
Amendment 128 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 130 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
Amendment 132 #
Proposal for a regulation
Recital 13 a (new)
(13 a) Additionally, in order to avoid imposing obligations simultaneously on two providers for the same content, a hosting service should be defined as an online platform when it has a direct relationship with the recipient of the service. A hosting provider who is acting as the infrastructure for an online platform should not be considered as an online platform based on this relationship, where it implements the decisions of the online platform and its user indirectly.
Amendment 144 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
Amendment 149 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1 a. Where a very large online platform decides not to put in place any of the mitigation measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report referred to in Article 28(3).
Amendment 151 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III; in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
Amendment 158 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
2 a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child and the right to gender equality, observance of which, once adopted, will be mandatory, at least for the following:
(a) age assurance and age verification pursuant to Article 13;
(b) child impact assessments pursuant to Article 13;
(c) age-appropriate terms and conditions that equally promote gender equality pursuant to Article 12;
(d) child-centred design that equally promotes gender equality pursuant to Article 13.
Amendment 165 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case, where set down in Union acts and, in particular, does not affect orders by national authorities in accordance with national legislation that implements European acts, in accordance with the conditions established in this Regulation and other European lex specialis. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Equally, nothing in this Regulation should prevent providers from enacting end-to-end encryption of their services.
Amendment 167 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
Amendment 168 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
Amendment 169 #
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
2. Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
Amendment 174 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the effective processing of those orders.
Amendment 179 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute manifestly illegal content in the majority of other Member States concerned and if the content is illegal within the Member State of establishment of a hosting provider and, where relevant, take account of the relevant rules of national, Union law or international law and the interests of international comity.
Amendment 182 #
Proposal for a regulation
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders to provide information in question relate to specific items of illegal content and information as defined in Union or national law, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders. Article 3 of Directive 2000/31/EC, however, continues to apply to any other orders related to non-specific individual items of illegal or legal content or information, general orders related to geoblocking of whole websites, webpages, or domains and any other matter which could be seen as restricting the freedom to provide their service across borders.
Amendment 185 #
Proposal for a regulation
Recital 2 a (new)
Recital 2 a (new)
(2a) Member States also undertake to promote, through multilateral agreements such as the International Partnership for Information and Democracy initiated by Reporters Without Borders and signed by 21 EU Member States, the regulation of the public information and communication space by establishing democratic guarantees for the digital space, based on the responsibility of platforms and guarantees for the reliability of information. These multilateral commitments offer convergent solutions on matters covered by this Regulation.
Amendment 185 #
Proposal for a regulation
Recital 34
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors, women and vulnerable users, such as those with protected characteristics under Article 21 of the Charter, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 186 #
Proposal for a regulation
Recital 35
Recital 35
(35) In that regard, it is important that the due diligence obligations are adapted to the type and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online. Similarly, in order to make sure that the obligations are only applied to those providers of intermediary services where the benefit would outweigh the burden on the provider, the Commission should be empowered to issue a waiver from the requirements of Chapter III, in whole or in parts, to those providers of intermediary services that are not-for-profit or equivalent and serve a manifestly positive role in the public interest, or are SMEs without any systemic risk related to illegal content. The providers should present justified reasons why they should be issued a waiver. The Commission should examine such an application and has the authority to issue or revoke a waiver at any time. The Commission should maintain a public list of all waivers issued and their conditions, containing a description of why the provider’s waiver is justified.
Amendment 191 #
Proposal for a regulation
Recital 38
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. At the same time, recipients should enter into such agreements willingly without any misleading or coercive tactics and therefore a ban on dark patterns should be introduced.
Amendment 193 #
Proposal for a regulation
Recital 38 a (new)
Recital 38 a (new)
(38 a) While an additional requirement should apply to very large online platforms, all providers should carry out a general self-assessment of potential risks related to their services, especially in relation to minors, and should take voluntary mitigation measures where appropriate. In order to ensure that the provider undertakes these actions, Digital Services Coordinators may ask for proof.
Amendment 194 #
Proposal for a regulation
Recital 4 a (new)
Recital 4 a (new)
(4a) As the Union is a Party to the United Nations Convention on the Rights of Persons with Disabilities (UN CRPD), the provisions of the Convention are an integral part of the Union legal order and binding upon the Union and its Member States. The UN CRPD requires its Parties to take appropriate measures to ensure that persons with disabilities have access, on an equal basis with others, to information and communications technologies and systems, and to other facilities and services open or provided to the public, both in urban and in rural areas. General Comment No 2 to the UN CRPD further states that “The strict application of universal design to all new goods, products, facilities, technologies and services should ensure full, equal and unrestricted access for all potential consumers, including persons with disabilities, in a way that takes full account of their inherent dignity and diversity.” Given the ever-growing importance of digital services and platforms in private and public life, in line with the obligations enshrined in the UN CRPD, the EU must ensure a regulatory framework for digital services which protects the rights of all recipients of services, including persons with disabilities.
Amendment 194 #
Proposal for a regulation
Recital 39
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 which do not also qualify as very large online platforms. In any public versions of such reports, providers of intermediary services should remove any information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider’s terms and conditions. _________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium- sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 199 #
Proposal for a regulation
Recital 5 a (new)
Recital 5 a (new)
(5a) Given the cross-border nature of the services at stake, Union action to harmonise accessibility requirements for intermediary services across the internal market is vital to avoid market fragmentation and to ensure that the equal right of all consumers and other recipients of services, including persons with disabilities, to access and choose those services is protected throughout the Union. A lack of harmonised accessibility requirements for digital services and platforms will also create barriers for the implementation of existing Union legislation on accessibility, as many of the services falling under those laws will rely on intermediary services to reach end-users. Therefore, accessibility requirements for intermediary services, including their user interfaces, must be consistent with existing Union accessibility legislation, such as the European Accessibility Act and the Web Accessibility Directive, so that no one is left behind as a result of digital innovation. This aim is in line with the Union of Equality: Strategy for the Rights of Persons with Disabilities 2021-2030 and the Union’s commitment to the United Nations’ Sustainable Development Goals.
Amendment 201 #
Proposal for a regulation
Recital 5 b (new)
Recital 5 b (new)
(5b) The notions of ‘access’ or ‘accessibility’ are often referred to with the meaning of affordability (financial access), availability, or in relation to access to data, use of network, etc. It is important to distinguish these from ‘accessibility for persons with disabilities’ which means that services, technologies and products are perceivable, operable, understandable and robust for persons with disabilities.
Amendment 202 #
Proposal for a regulation
Recital 40 a (new)
Recital 40 a (new)
(40 a) Notices should be directed to the actor that has the technical and operational ability to act and the closest relationship to the recipient of the service that provided the information or content, such as an online platform rather than the hosting service provider which provides services to that online platform. Such hosting service providers should redirect such notices to the particular online platform and inform the notifying party of this fact.
Amendment 203 #
Proposal for a regulation
Recital 40 b (new)
Recital 40 b (new)
(40 b) Hosting providers should seek to act only against the items of information notified. This may include acts such as disabling hyperlinking to the items of information. Where the removal or disabling of access to individual items of information is technically or operationally unachievable due to legal, contractual, or technological reasons, such as encrypted file and data storage and sharing services, hosting providers should inform the recipient of the service of the notification and seek action. If a recipient fails to act or delays action, or the provider has reason to believe that the recipient has failed to act or is otherwise acting in bad faith, the hosting provider may suspend their service in line with their terms and conditions.
Amendment 204 #
Proposal for a regulation
Recital 7
Recital 7
(7) In order to ensure the effectiveness of the rules laid down in this Regulation and a level playing field within the internal market, those rules should apply to providers of intermediary services irrespective of their place of establishment or residence, in so far as they provide services in, and direct services at, the Union, as evidenced by a substantial connection to the Union.
Amendment 205 #
Proposal for a regulation
Recital 41
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination, the right to gender equality and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non- discrimination of parties affected by illegal content.
Amendment 206 #
Proposal for a regulation
Recital 8
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of active monthly users in one or more Member States, or the proactive directing of activities towards one or more Member States. The proactive directing of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the possibility of ordering products or services, the use of a national top level domain, or the fact that the intermediary service provider solicits the conclusion of distance contracts from residents of the Union and that a contract has actually been concluded at a distance, by whatever means. In this respect, the language or currency which a website uses does not constitute a relevant factor. The proactive directing of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. The mere availability of a service in a Member State should not be considered as a proactive offering of a service by the provider. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27 . 
On the other hand, mere technical accessibility of a website from the Union or the use of an international language with more than 100 million native speakers cannot, on those grounds alone, be considered as establishing a substantial connection to the Union. __________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L351, 20.12.2012, p.1). (The exact number that qualifies as ‘significant’ is to be fixed during negotiations.)
Amendment 206 #
Proposal for a regulation
Recital 41 a (new)
Recital 41 a (new)
(41 a) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, either because it is illegal or is not allowed under its terms and conditions, it should do so in a timely manner, taking into account the potential harm of the infraction and the technical abilities of the provider. Information that could have a negative effect on minors, women and vulnerable users, such as those with protected characteristics under Article 21 of the Charter, should be seen as a matter requiring urgency.
Amendment 208 #
Proposal for a regulation
Recital 42
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available means of recourse to challenge the decision of the hosting service provider should always include judicial redress. Such a statement, however, should not be required if it relates to spam, manifestly illegal content, removal of content similar or identical to content already removed from the same recipient, who has already received a statement, or where a provider of hosting services does not have the information necessary to inform the recipient on a durable medium.
Amendment 214 #
Proposal for a regulation
Article 1 – paragraph 2 – point b b (new)
Article 1 – paragraph 2 – point b b (new)
(b b) stimulate a level playing field in the online ecosystem by introducing interoperability requirements for very large platforms.
Amendment 214 #
Proposal for a regulation
Recital 9
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, among others, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. To assist Member States and providers, the Commission should provide guidelines as to how to interpret the interaction between different Union acts and how to prevent any duplication of requirements on providers or potential conflicts in the interpretation of similar requirements. __________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1 . 29Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
Amendment 215 #
Proposal for a regulation
Recital 43
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. _________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium- sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 219 #
Proposal for a regulation
Recital 10
Recital 10
(10) For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council31 , Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33 as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council34 , Directive 2011/83/EU of the European Parliament and of the Council35 , Directive (EU) 2019/882 of the European Parliament and of the Council, and Directive 93/13/EEC36 , as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37 , and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council.38 The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union law on working conditions. __________________ 30Regulation (EU) 2019/1148 of the European Parliament and of the Council on the marketing and use of explosives precursors, amending Regulation (EC) No 1907/2006 and repealing Regulation (EU) No 98/2013 (OJ L 186, 11.7.2019, p. 1). 31 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57). 
32Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), OJ L 201, 31.7.2002, p. 37. 33Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC. 34Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to- consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 35Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council. 36Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts. 37Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules 38Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
Amendment 220 #
Proposal for a regulation
Recital 44
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of entering, in good faith, into an out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies located in either the Member State of the recipient or the provider and that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 225 #
Proposal for a regulation
Recital 46
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should normally only be awarded to non-governmental entities, and not natural persons, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities, however, can be public in nature for actions not related to intellectual property rights, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, non-governmental organisations of industry and of right-holders could also be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. 
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
Amendment 227 #
Proposal for a regulation
Recital 12
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under Union or national law, as a result of its display on an intermediary service, is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or is illegal due to its direct connection to or promotion of an illegal activity, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, illegal trading of animals, plants and substances, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 230 #
Proposal for a regulation
Recital 12
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law, including the EU Charter of Fundamental Rights, and what the precise nature or subject matter is of the law in question.
Amendment 231 #
Proposal for a regulation
Recital 47
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and effective safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 232 #
Proposal for a regulation
Recital 48
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving an imminent threat to the life or safety of a person, notably when it concerns vulnerable users, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44 . In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing upon request all relevant information available to it, including where relevant the content in question and an explanation of its suspicion and, unless instructed otherwise, should remove or disable the content. Information obtained by a law enforcement or judicial authority of a Member State in accordance with this Article should not be used for any purpose other than those directly related to the individual serious criminal offence notified. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
Amendment 239 #
Proposal for a regulation
Recital 13
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, content-sharing platforms, search engines, livestreaming platforms, messaging services or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
Amendment 240 #
Proposal for a regulation
Recital 50
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a user-friendly way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48.
The online interface should allow traders to provide the information referred to in Article 22a of this Regulation, the information referred to in Article 6 of Directive 2011/83/EU on Consumer Rights, information on sustainability of products, and information allowing for the unequivocal identification of the product or the service, including labelling requirements, in compliance with legislation on product safety and product compliance. _________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
Amendment 241 #
Proposal for a regulation
Recital 13
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, search engines, online marketplaces, and messaging services used as sales channels, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
Amendment 242 #
Proposal for a regulation
Recital 13 a (new)
Recital 13 a (new)
(13a) Additionally, in order to avoid imposing obligations simultaneously on two providers for the same content, a hosting service should only be deemed an online platform when it has a direct relationship with the recipient of the service. A hosting provider acting as the infrastructure for an online platform should not be considered an online platform on the basis of that relationship, where it implements the decisions of the online platform and relates to its users only indirectly.
Amendment 242 #
Proposal for a regulation
Recital 50 a (new)
Recital 50 a (new)
(50 a) Providers of online marketplaces should demonstrate their best efforts to prevent the dissemination by traders of illegal products and services. In compliance with the principle that no general monitoring obligation may be imposed, providers should inform recipients when the service or product they have acquired through their services is illegal. Once notified of an illegal product or service as foreseen in Article 14, providers of online marketplaces should take effective and proportionate measures to prevent such products or services from reappearing on their online marketplace.
Amendment 243 #
Proposal for a regulation
Recital 13 b (new)
Recital 13 b (new)
(13b) For the purpose of this Regulation, a cloud computing service should not be considered an ‘online platform’ where allowing the dissemination of hyperlinks to a specific content is a minor and ancillary feature. Moreover, a cloud computing service, when serving as infrastructure, for example as the underlying infrastructural storage and computing services of an internet-based application or online platform, should not in itself be seen as disseminating to the public information stored or processed at the request of a recipient of an application or online platform which it hosts.
Amendment 247 #
Proposal for a regulation
Recital 52
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 248 #
Proposal for a regulation
Recital 14
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, falling within the scope of this Regulation should not be seen as disseminating to the public. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. Where multiple providers are involved in the dissemination of information to the public, the obligations related to that dissemination should lie with the outward-facing provider closest in relation to the accessibility by the end user recipient of the final service. __________________ 39Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
Amendment 258 #
Proposal for a regulation
Recital 57
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition, or the misuse of the platforms' terms and conditions, including content moderation policies, when enforced, often through automatic means. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, fundamental rights, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices.
Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 260 #
Proposal for a regulation
Recital 58
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, prevent the manipulation and exploitation of the service, including by the amplification of content which is counter to their terms and conditions, adapting their decision-making processes, or adapting their terms and conditions and content moderation policies and how those policies are enforced, while being fully transparent to the users. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources, including by displaying related public service advertisements instead of other commercial advertisements. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures.
Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 262 #
Proposal for a regulation
Recital 18
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. The mere ranking or displaying in an order, or the use of a recommender system, should not, however, be deemed as having control over that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.
Amendment 269 #
Proposal for a regulation
Recital 21
Recital 21
(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted. It also should not be understood to cover the ranking or sorting of information to make it accessible to a user or actions required to ensure the security of the transmissions.
Amendment 274 #
Proposal for a regulation
Recital 22
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The hosting services must take into account the harm that can potentially occur and act proportionately. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 280 #
Proposal for a regulation
Recital 63
Recital 63
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. In addition, very large online platforms should label any known deep fake videos, audio or other files.
Amendment 283 #
Proposal for a regulation
Recital 64
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers who meet the conditions set out in this Regulation. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
Amendment 285 #
Proposal for a regulation
Recital 23
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online marketplaces which are online platforms that allow consumers to conclude distance contracts with traders on the online platform itself, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online marketplaces present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. This may include the storage, packing and shipment of a good from a warehouse under the control of the online marketplace. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
Amendment 293 #
Proposal for a regulation
Recital 25
Recital 25
Amendment 295 #
Proposal for a regulation
Article 2 a (new)
Article 2 a (new)
Article 2 a
Targeting of digital advertising
1. Providers of information society services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of determining the recipients to whom advertisements are displayed.
2. This provision shall not prevent information society services from determining the recipients to whom advertisements are displayed on the basis of contextual information such as keywords, the language setting communicated by the device of the recipient or the geographical region of the recipients to whom an advertisement is displayed.
3. The use of the contextual information referred to in paragraph 2 shall only be permissible if it does not allow for the direct or, by means of combining it with other information, indirect identification of one or more natural persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person or persons.
Amendment 309 #
Proposal for a regulation
Recital 27
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be and among others, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, Virtual Private Networks, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service.
Amendment 311 #
Proposal for a regulation
Recital 27 a (new)
Recital 27 a (new)
(27a) A single webpage or website may include elements that qualify differently between ‘mere conduit’, ‘caching’ or hosting services and the rules for exemptions from liability should apply to each accordingly. For example, a search engine may act solely as a ‘caching’ service as to information included in the results of an inquiry. Elements displayed alongside those results, such as online advertisements, would however still meet the standard of a hosting service.
Amendment 312 #
Proposal for a regulation
Recital 28
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case, where set down in Union acts and, in particular, does not affect orders by national authorities in accordance with national legislation that implements Union acts, in accordance with the conditions established in this Regulation and other Union law regarded as lex specialis. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Equally, nothing in this Regulation should prevent providers from enacting end-to-end encryption of their services.
Amendment 313 #
Proposal for a regulation
Recital 28
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. This should be without prejudice to decisions of Member States to require service providers, who host information provided by users of their service, to apply due diligence measures.
Amendment 314 #
Proposal for a regulation
Recital 76
Recital 76
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV and Article 8 and 9 by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction.
Amendment 317 #
Proposal for a regulation
Recital 78
Recital 78
(78) Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation. In order to ensure coherence between the Member States, the Commission should adopt guidance on the procedures and rules related to the powers of Digital Services Coordinators.
Amendment 319 #
Proposal for a regulation
Recital 28
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
Amendment 319 #
Proposal for a regulation
Recital 85
Recital 85
(85) Where a Digital Services Coordinator requests another Digital Services Coordinator to take action, the requesting Digital Services Coordinator, or the Board in case it issued a recommendation to assess issues involving more than four Member States, should be able to refer the matter to the Commission in case of any disagreement as to the assessments or the measures taken or proposed or a failure to adopt any measures. If the Commission believes that the Digital Services Coordinator of establishment has not at least partially addressed the request or has not fully justified its decision to not address the request, the Commission, on the basis of the information made available by the concerned authorities, should accordingly be able to request the competent Digital Services Coordinator to re-assess the matter and take the necessary measures to ensure compliance within a defined time period. This possibility is without prejudice to the Commission’s general duty to oversee the application of, and where necessary enforce, Union law under the control of the Court of Justice of the European Union in accordance with the Treaties. A failure by the Digital Services Coordinator of establishment to take any measures pursuant to such a request may also lead to the Commission’s intervention under Section 3 of Chapter IV of this Regulation, where the suspected infringer is a very large online platform.
Amendment 322 #
Proposal for a regulation
Recital 29
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws, in conformity with Union law, including the EU Charter of Fundamental Rights, on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and to ensure the effective processing of those orders.
Amendment 322 #
Proposal for a regulation
Recital 88
Recital 88
(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level and with legal personality, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite to its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to the Member State’s Digital Services Coordinator.
Amendment 324 #
Proposal for a regulation
Recital 29
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the effective processing of those orders.
Amendment 327 #
Proposal for a regulation
Recital 30
Recital 30
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law, and confidentiality requests by law enforcement authorities related to the non-disclosure of information. Nevertheless, the same relevant protections for providers and users granted in Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online] should be provided here in order to ensure equivalent rules and protections for all types of content and information covered by such orders. This includes the ability of a provider to challenge an order before its Digital Services Coordinator of establishment and to seek a decision as to the effect to be given to the order. The Digital Services Coordinator of establishment should be able to take a decision to suspend or limit the application of the order, where it considers the order to be in conflict with Union law or its national law.
Amendment 332 #
Proposal for a regulation
Article 14 – paragraph 2 – point d
Article 14 – paragraph 2 – point d
(d) a statement confirming, to the best knowledge of the individual or entity submitting the notice, that the information and allegations contained therein are accurate and complete.
Amendment 332 #
Proposal for a regulation
Recital 31
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute manifestly illegal content in the majority of other Member States concerned and whether the content is illegal within the Member State of establishment of a hosting provider and, where relevant, take account of the relevant rules of national, Union or international law and the interests of international comity.
Amendment 333 #
Proposal for a regulation
Recital 31
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law, in conformity with Union law, including the EU Charter of Fundamental Rights, enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of national, Union or international law and the interests of international comity.
Amendment 336 #
Proposal for a regulation
Recital 97 a (new)
Recital 97 a (new)
(97 a) The Commission should ensure that it is independent and impartial in its decision-making with regard to both Digital Services Coordinators and providers of services under this Regulation.
Amendment 337 #
Proposal for a regulation
Recital 98
Recital 98
(98) In view of both the particular challenges that may arise in seeking to ensure compliance by very large online platforms and the importance of doing so effectively, considering their size and impact and the harms that they may cause, the Commission should have strong investigative and enforcement powers to allow it to investigate, enforce and monitor certain of the rules laid down in this Regulation, in full respect of the principle of proportionality and the rights and interests of the affected parties, including the right to challenge any investigative requests before a judicial authority within the Member State of establishment.
Amendment 338 #
Proposal for a regulation
Recital 33
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information as defined in Union or national law, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders. Article 3 of Directive 2000/31/EC, however, continues to apply to any other orders related to non-specific individual items of illegal or legal content or information, general orders related to geoblocking of whole websites, webpages, or domains and any other matter which could be seen as restricting the freedom to provide their services across borders.
Amendment 339 #
Proposal for a regulation
Recital 33
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information as defined in Union or national law, in conformity with Union law, including the EU Charter of Fundamental Rights, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.
Amendment 341 #
Proposal for a regulation
Recital 99
Recital 99
(99) In particular, the Commission, where it can show grounds for believing that a very large online platform is not compliant with this Regulation, should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties other than individuals provide any relevant evidence, data and information related to those concerns. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers.
Amendment 342 #
Proposal for a regulation
Recital 34
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety, health and trust of the recipients of the service, including minors, women and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to provide recourse to recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 344 #
Proposal for a regulation
Recital 104
Recital 104
(104) In order to fulfil the objectives of this Regulation, the power to adopt acts in accordance with Article 290 of the Treaty should be delegated to the Commission to supplement this Regulation. In particular, delegated acts should be adopted in respect of criteria for identification of very large online platforms and of technical specifications for access requests. It is equally important that, when standardisation bodies are unable to agree on the standards needed to implement this Regulation fully, the Commission may choose to adopt delegated acts. It is of particular importance that the Commission carries out appropriate consultations and that those consultations be conducted in accordance with the principles laid down in the Interinstitutional Agreement on Better Law-Making of 13 April 2016. In particular, to ensure equal participation in the preparation of delegated acts, the European Parliament and the Council receive all documents at the same time as Member States' experts, and their experts systematically have access to meetings of Commission expert groups dealing with the preparation of delegated acts.
Amendment 348 #
1. This Regulation lays down harmonised rules on the provision of intermediary services in order to improve the functioning of the internal market whilst ensuring the rights enshrined in the Charter of Fundamental Rights of the European Union, in particular the freedom of expression and information in an open and democratic society. In particular, it establishes:
Amendment 351 #
Proposal for a regulation
Recital 35
Recital 35
(35) In that regard, it is important that the due diligence obligations are adapted to the type and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation in relation to those services. Services that do not fall within those different categories should not be affected, even when provided by the same provider or under the same ownership structure. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online.
Amendment 351 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
Article 1 – paragraph 2 – point b
(b) set out uniform harmonised rules for a safe, predictable, accessible and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
Amendment 354 #
Proposal for a regulation
Recital 35 a (new)
Recital 35 a (new)
(35a) Similarly, in order to ensure that the obligations are only applied to those providers of intermediary services where the benefit would outweigh the burden on the provider, the Commission should be empowered to issue a waiver from the requirements of Chapter III, in whole or in part, to those providers of intermediary services that are not-for-profit or equivalent and serve a manifestly positive role in the public interest, or that are SMEs without any systemic risk related to illegal content. The providers should present justified reasons why they should be issued a waiver. The Commission should examine such an application and should have the authority to issue or revoke a waiver at any time. The Commission should maintain a public list of all waivers issued and their conditions, including a description of why the provider is justified in receiving a waiver.
Amendment 355 #
Proposal for a regulation
Recital 36
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. This contact point may be the same contact point as required under other Union acts. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
Amendment 355 #
Proposal for a regulation
Article 1 – paragraph 4 a (new)
Article 1 – paragraph 4 a (new)
4 a. This Regulation shall respect the fundamental rights recognised by the Charter of Fundamental Rights of the European Union and the fundamental rights constituting general principles of Union law. Accordingly, this Regulation may only be interpreted and applied in accordance with those fundamental rights, including the freedom of expression and information, as well as the freedom and pluralism of the media. When exercising the powers set out in this Regulation, all public authorities involved shall aim to achieve, in situations where the relevant fundamental rights conflict, a fair balance between the rights concerned, in accordance with the principle of proportionality.
Amendment 358 #
Proposal for a regulation
Article 1 – paragraph 5 – point b
Article 1 – paragraph 5 – point b
(b) Directive 2010/13/EU as amended by Directive (EU) 2018/1808;
Amendment 359 #
Proposal for a regulation
Recital 37
Recital 37
(37) Providers of intermediary services that are established in a third country and that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. Where a provider of intermediary services established in a third country chooses not to do so, it becomes subject to the jurisdiction of all Member States, in accordance with Article 40(3).
Amendment 360 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
Article 1 – paragraph 5 – point c
(c) Union law on copyright and related rights, in particular Directive (EU) 2019/790 on Copyright and Related Rights in the Digital Single Market;
Amendment 363 #
Proposal for a regulation
Recital 38
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. At the same time, recipients should enter into such agreements willingly without any misleading or coercive tactics and therefore a ban on dark patterns should be introduced.
Amendment 363 #
Proposal for a regulation
Article 1 – paragraph 5 – point h
Article 1 – paragraph 5 – point h
(h) Union law on consumer protection and product safety, including Regulation (EU) 2017/2394, Regulation (EU) 2019/1020 and Regulation XXX (General Product Safety Regulation);
Amendment 364 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
Article 1 – paragraph 5 – point i a (new)
(i a) Directive (EU) 2019/882
Amendment 366 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
Article 1 – paragraph 5 a (new)
5 a. The Commission shall, by [within one year of the adoption of this Regulation], publish guidelines on the relationship between this Regulation and the legislative acts listed in Article 1(5). These guidelines shall clarify any potential conflicts between the conditions and obligations laid down in those legislative acts, which act prevails where actions taken in line with this Regulation fulfil the obligations of another legislative act, and which regulatory authority is competent.
Amendment 369 #
Proposal for a regulation
Recital 39
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC40 which do not also qualify as very large online platforms. In any public versions of such reports, providers of intermediary services should remove any information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider’s terms and conditions. __________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 382 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law, or which, due to its connection to or promotion of an illegal activity, including the sale of products, substances, animals or plants, or the provision of services, directly leads to the dissemination to the public of such illegal content. Material disseminated for educational, journalistic, artistic or research purposes or for the purposes of preventing or countering illegal content, including content which represents an expression of polemic or controversial views in the course of public debate, shall not be considered as illegal content. An assessment shall determine the true purpose of that dissemination and whether material is disseminated to the public for those purposes.
Amendment 383 #
Proposal for a regulation
Recital 40 a (new)
Recital 40 a (new)
(40a) Nevertheless, notices should be directed to the actor that has the technical and operational ability to act and the closest relationship to the recipient of the service that provided the information or content, such as to an online platform and not to the hosting service provider which provides services to that online platform. Such hosting service providers should redirect such notices to the particular online platform and inform the notifying party of this fact.
Amendment 384 #
Proposal for a regulation
Recital 40 b (new)
Recital 40 b (new)
(40b) Moreover, hosting providers should seek to act only against the items of information notified. This may include acts such as disabling hyperlinking to the items of information. Where the removal or disabling of access to individual items of information is technically or operationally unachievable due to legal, contractual, or technological reasons, such as encrypted file and data storage and sharing services, hosting providers should inform the recipient of the service of the notification and seek action. If a recipient fails to act or delays action, or the provider has reason to believe that the recipient has failed to act or is otherwise acting in bad faith, the hosting provider may suspend its service in line with its terms and conditions.
Amendment 387 #
Proposal for a regulation
Recital 41 a (new)
Recital 41 a (new)
(41a) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, either because it is illegal or is not allowed under its terms and conditions, it should do so in a timely manner, taking into account the potential harm of the infraction and the technical abilities of the provider. Information that could have a negative effect on minors, women and vulnerable users such as those with protected characteristics under Article 21 of the Charter should be seen as a matter requiring urgency.
Amendment 388 #
Proposal for a regulation
Recital 42
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means that have been proven to be efficient, proportionate and reliable, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. Such a statement, however, should not be required if it relates to spam, manifestly illegal content, or the removal of content similar or identical to content already removed from the same recipient who has already received a statement, or where a provider of hosting services does not have the information necessary to inform the recipient on a durable medium.
Amendment 398 #
Proposal for a regulation
Recital 42 a (new)
Recital 42 a (new)
(42a) Due to the international nature of many providers of hosting services, many have already implemented similar requirements under the laws of third countries. In order to prevent a doubling of requirements and the removal of existing systems for recipients, the Commission should be empowered to declare these mechanisms as ensuring an adequate level of protection and fulfilling the requirements of Article 14 and Article 15.
Amendment 401 #
Proposal for a regulation
Article 2 – paragraph 1 – point q
Article 2 – paragraph 1 – point q
(q) ‘terms and conditions’ means all terms and conditions or specifications by the service provider, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services, and are unilaterally determined by the provider of online intermediary services and that unilateral determination of terms and conditions is being evaluated on the basis of an overall assessment for which the relative size of the parties concerned, the fact that a negotiation took place, or that certain provisions thereof might have been subject to such a negotiation and determined together by the relevant provider and recipient are not, in themselves, decisive; or the rules laid down by the intermediary service provider under which users will be allowed to use the intermediation service concerned.
Amendment 404 #
Proposal for a regulation
Recital 44
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of entering, in good faith, into an out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies located in either the Member State of the recipient or of the provider and that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 406 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
Article 2 – paragraph 1 – point q a (new)
(q a) ‘dark pattern’ means a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision- making or choice.
Amendment 408 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
Article 2 – paragraph 1 – point q b (new)
(q b) ‘deep fake’ means a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful;
Amendment 409 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
Article 2 – paragraph 1 – point q b (new)
(q b) 'minor' means a child below the age of 16, as established in Regulation (EU) 2016/679.
Amendment 410 #
Proposal for a regulation
Article 2 – paragraph 1 – point q c (new)
Article 2 – paragraph 1 – point q c (new)
(q c) ‘persons with disabilities’ means persons with disabilities within the meaning of Article 3(1) of Directive (EU) 2019/882
Amendment 412 #
Proposal for a regulation
Recital 46
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should normally only be awarded to non-governmental entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities, however, can be public in nature for actions not related to intellectual property rights, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi- public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, non- governmental organisations of industry and of right- holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. 
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 __________________ 43Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
Amendment 422 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and effective safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 423 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and effective safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 429 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, notably when it concerns vulnerable users, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44 . In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. __________________ 44Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
Amendment 430 #
Proposal for a regulation
Article 13 a (new)
Amendment 430 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life, health or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44 . In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. __________________ 44Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
Amendment 433 #
Proposal for a regulation
Recital 48 a (new)
(48a) Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it should remove or disable the content and promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all available relevant information.
Amendment 435 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online marketplaces should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online marketplaces should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary and no longer than six months after the end of a relationship with the trader, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate direct interest, including through the orders to provide information referred to in this Regulation.
Amendment 437 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online marketplaces should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the providers of online marketplaces, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
Amendment 440 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. This Regulation shall not prevent providers from offering end-to-end encrypted services. The provision of such services shall not constitute a reason for liability or for becoming ineligible for the exemptions from liability.
Amendment 441 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the providers of online marketplaces covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45 , or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the providers of online marketplaces covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such providers, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Providers of online marketplaces should also design and organise their online interface in a user-friendly way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46 , Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48 .
The online interface should allow traders to provide the information referred to in Article 22a of this Regulation, the information referred to in Article 6 of Directive 2011/83/EU on Consumer Rights, information on sustainability of products, and information allowing for the unequivocal identification of the product or the service, including labelling requirements, in compliance with legislation on product safety and product compliance. __________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
Amendment 443 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
Providers of intermediary services shall not be obliged to use automated tools for content moderation.
Amendment 445 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific individual item of illegal content, received from and issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, including the Charter of Fundamental Rights of the European Union, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 447 #
Proposal for a regulation
Article 8 – paragraph 1 a (new)
1 a. If the provider cannot comply with the removal order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the authority that has issued the order.
Amendment 448 #
Proposal for a regulation
Recital 50 a (new)
(50a) The online interface of online marketplaces should allow traders to provide the information referred to in Article 22a of this Regulation and any other information, where needed and necessary, to allow for the unequivocal identification of the product or the service, including labelling requirements, in compliance with legislation on product safety and product compliance. Providers of online marketplaces, when they become aware that a product or service is illegal, should inform recipients who have acquired the product or service through their marketplace of this fact and of any possible redress.
Amendment 448 #
Proposal for a regulation
Article 8 – paragraph 1 b (new)
1 b. Where the provider does not have its main establishment or legal representative in the Member State of the competent authority that has issued the order and the provider believes that the implementation of an order issued under paragraph 1 would infringe the Charter of Fundamental Rights of the European Union, Union law, or the national law of the Member State in which the main establishment or legal representative of the provider is located, or does not meet the conditions of paragraph 2, the provider shall have the right to submit a reasoned request for a decision of the Digital Services Coordinator of the Member State of establishment. The provider shall inform the authority issuing the order of this submission.
Amendment 449 #
Proposal for a regulation
Recital 50 a (new)
(50a) Providers of online marketplaces should demonstrate their best efforts to prevent the dissemination by traders of illegal products and services. In compliance with the no general monitoring principle, providers should inform recipients when the service or product they have acquired through their services is illegal. Once notified of an illegal product or service as foreseen in Article 14, providers of online marketplaces should take effective and proportionate measures to prevent such products or services from reappearing on their online marketplace.
Amendment 449 #
Proposal for a regulation
Article 8 – paragraph 1 c (new)
1 c. Upon receiving such a submission, the Digital Services Coordinator shall in a timely manner scrutinise the order and inform the provider of its decision. Where the Digital Services Coordinator agrees with the reasoning of the provider, in whole or in part, it shall inform, without undue delay, the Digital Services Coordinator of the Member State of the judicial or administrative authority issuing the order of its objection. The Digital Services Coordinator may choose to intervene on behalf of the provider in any redress, appeal or other legal processes in relation to the order.
Amendment 450 #
Proposal for a regulation
Article 8 – paragraph 1 d (new)
1 d. Until an objection under paragraph 1, point (c) is withdrawn, any penalties, fines or other sanctions related to the non-implementation of an order issued by the relevant national judicial or administrative authorities shall be suspended and the order shall cease to have legal effects.
Amendment 451 #
Proposal for a regulation
Article 8 – paragraph 1 e (new)
1 e. Paragraphs 1b and 1c shall not apply in the case of very large online platforms or where content is manifestly illegal under Union law.
Amendment 453 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 a (new)
- the identification of the issuing authority and the means to verify the authenticity of the order;
Amendment 456 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 457 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 459 #
Proposal for a regulation
Article 8 – paragraph 2 – point b
(b) the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective and in any case does not exceed the territory of the Member State of the order;
Amendment 464 #
Proposal for a regulation
Recital 52 b (new)
(52b) Providers of public interest journalism should be identified through voluntary, self-regulatory European standards or European standardisation deliverables as defined by Regulation (EU) No 1025/2012 of the European Parliament and of the Council ('technical standards'), transparently developed, governed and enforced, and such standards must be based on internationally accepted best practices and ethical norms;
Amendment 464 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such case, the point of contact may request the competent authority to provide translation into the language declared by the provider.
Amendment 467 #
Proposal for a regulation
Article 8 – paragraph 2 – point c a (new)
(c a) the order is issued only where no other effective means are available to bring about the cessation or the prohibition of the infringement;
Amendment 468 #
Proposal for a regulation
Article 8 – paragraph 2 – point c b (new)
(c b) where more than one provider of intermediary services is responsible for hosting the specific item, the order is issued to the most appropriate provider that has the technical and operational ability to act against the specific item.
Amendment 469 #
Proposal for a regulation
Article 8 – paragraph 2 a (new)
2 a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific template and form for such orders.
Amendment 470 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The determination of this operational threshold, therefore, should only take into account those recipients who are physical persons residing in the Union or physical persons acting on behalf of a legal person established in the Union. Automated bots, fake accounts, indirect hyperlinking, FTP or other indirect downloading of content should not be included in the determination of whether this threshold is exceeded. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.
Amendment 470 #
Proposal for a regulation
Article 8 – paragraph 2 b (new)
2 b. Member States shall ensure that providers have a right to appeal and object to implementing the order and shall facilitate the use and access to that right.
Amendment 471 #
Proposal for a regulation
Article 8 – paragraph 2 c (new)
2 c. When an order to act against a specific individual item of illegal content is issued by a relevant national judicial or administrative authority, Member States shall ensure that the relevant national judicial or administrative authority duly informs the Digital Services Coordinator from the Member State of the judicial or administrative authority.
Amendment 474 #
Proposal for a regulation
Article 8 – paragraph 3 – subparagraph 1 (new)
Where, upon receiving the copy of the order, at least three Digital Services Coordinators consider that the order violates Union or national law that is in conformity with Union law, including the Charter, they may object to the enforcement of the order before the Board, based on a reasoned statement. Following a recommendation of the Board, the Commission may decide whether the order is to be enforced.
Amendment 476 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products or the display of copyright-infringing content. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech, hampering competition, or the way platforms' terms and conditions, including content moderation policies, are enforced, including through automatic means. With respect to this category of risks, particular attention should be paid to the detrimental effect of intimidation of the independent press and the harassment of journalists, in particular women, who are more often victims of hateful speech and online threats.
These should be considered systemic risks as referred to in Article 26, as they pose a threat to democratic values, media freedom, and freedom of expression and information, and should be subject to dedicated mitigating measures as referred to in Article 27, and priority notice through trusted flaggers as referred to in Article 19. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, fundamental rights, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 477 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition, or the misuse of the way platforms' terms and conditions, including content moderation policies, are enforced, often through automatic means. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, fundamental rights, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices.
Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 478 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service through the submission of abusive notices, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices.
Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 478 #
Proposal for a regulation
Article 8 – paragraph 4
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law and administrative law in conformity with Union law, including the Charter of Fundamental Rights. While acting in accordance with such laws, authorities shall not go beyond what is necessary in order to attain the objectives followed therein.
Amendment 482 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, prevent the manipulation and exploitation of the service, including by the amplification of content which is counter to their terms and conditions, adapting their decision-making processes, or adapting their terms and conditions and content moderation policies and how those policies are enforced, while being fully transparent to the users. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources, including by displaying related public service advertisements instead of other commercial advertisements. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures.
Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 488 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, received from and issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order.
Amendment 489 #
Proposal for a regulation
Article 9 – paragraph 1 a (new)
1 a. If the provider cannot comply with the information order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the authority that issued the information order.
Amendment 490 #
Proposal for a regulation
Article 9 – paragraph 1 b (new)
1 b. Where the provider does not have its main establishment or legal representative in the Member State of the competent authority that issued the order, and the provider believes that the implementation of an order issued under paragraph 1 would infringe the Charter, Union law, or the national law of the Member State in which the main establishment or legal representative of the provider is located, or does not meet the conditions of paragraph 2, the provider shall have the right to submit a reasoned request for a decision of the Digital Services Coordinator from the Member State of establishment. The provider shall inform the authority issuing the order of this submission.
Amendment 491 #
Proposal for a regulation
Article 9 – paragraph 1 c (new)
1 c. Upon receiving such a submission, the Digital Services Coordinator shall in a timely manner scrutinise the order and inform the provider of its decision. Where the Digital Services Coordinator agrees with the reasoning of the provider, in whole or in part, it shall, without undue delay, inform the Digital Services Coordinator of the Member State of the judicial or administrative authority issuing the order of its objection. The Digital Services Coordinator may choose to intervene on behalf of the provider in any redress, appeal or other legal process in relation to the order.
Amendment 492 #
Proposal for a regulation
Article 9 – paragraph 1 d (new)
1 d. Until an objection under paragraph 1, point (c) is withdrawn, any penalties, fines or other sanctions related to the non-implementation of an order issued by the relevant national judicial or administrative authorities shall be suspended and the order shall cease to have legal effects.
Amendment 495 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 (new)
-1 the identification of the issuing authority and the means to verify the authenticity of the order;
Amendment 499 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. In addition, very large online platforms should label any known deep fake videos, audio or other files.
Amendment 501 #
Proposal for a regulation
Recital 63 a (new)
(63a) The practice of very large online platforms of associating advertisements with content uploaded by users could indirectly lead to the promotion of illegal content, or content that is in breach of their terms and conditions, and could risk considerably damaging the brand image of the buyers of advertising space. In order to prevent such practices, the very large online platforms should ensure, including through standard contractual guarantees to the buyers of advertising space, that the content to which they associate advertisements is legal and compliant with their terms and conditions. Furthermore, the very large online platforms should allow advertisers to have access to the results of audits carried out independently, evaluating the commitments and tools of platforms for protecting the brand image of the buyers of advertising space ("brand safety").
Amendment 503 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms, such as the dissemination of illegal content and the amplification of harmful content brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
Amendment 504 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers, meaning researchers who meet the conditions set down in this Regulation. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
Amendment 505 #
Proposal for a regulation
Article 9 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such a case, the point of contact may request the competent authority to provide a translation into the language declared by the provider;
Amendment 506 #
Proposal for a regulation
Article 9 – paragraph 2 – point c a (new)
(c a) the order is issued only where no other effective means are available to receive the same specific item of information;
Amendment 507 #
Proposal for a regulation
Article 9 – paragraph 2 a (new)
2 a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific template and form for such orders. It shall ensure that the form meets the standards set down in the Annex of [XXX the regulation on European Production and Preservation Orders for electronic evidence in criminal matters].
Amendment 508 #
Proposal for a regulation
Article 9 – paragraph 2 b (new)
Amendment 509 #
Proposal for a regulation
Recital 65 a (new)
(65a) Any change to the recommender systems used by platforms to suggest, rank and prioritise information can have a dramatic impact on users, in particular on the media, which widely rely on platforms to be accessible to their audience; consequently, providers of online platforms should be transparent about any changes made to their referencing and recommendation rules, even if made on an experimental basis, and immediately inform the regulators, their users and the authors of referenced content, allowing these changes to be predictable to those affected by them; users should be able to refer the matter to the regulator, asking it to give its opinion on the negative impact of changes in the referencing and recommendation rules and allowing it to require the platform to remedy this impact.
Amendment 512 #
Proposal for a regulation
Article 9 – paragraph 4
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law and administrative law in conformity with Union law.
Amendment 513 #
Proposal for a regulation
Recital 66
(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as standardised disclosure frames for advertising, allowing the submission of notices, including through application programming interfaces, or the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate.
Amendment 516 #
Proposal for a regulation
Recital 67
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation, as well as the compliance of online platforms with the provisions of these codes. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
Amendment 519 #
Proposal for a regulation
Article -10 (new)
Amendment 521 #
Proposal for a regulation
Article -10 a (new)
Article -10 a
Conflict between Union Acts
1. Where any obligation set down in this Regulation can be viewed as equivalent to, or superseded by, an obligation under another Union act to which a provider of intermediary services is also subject, the provider may apply to the Commission for a waiver from such requirements or for a declaration that it should be deemed to have complied with this Regulation, in whole or in part. The provider shall present justified reasons for its request.
2. The Commission shall examine such an application and, after consulting the Board, may issue a waiver from, or a declaration of compliance with, the requirements of this Regulation, in whole or in part.
3. Upon the request of the Board or on its own initiative, the Commission may review a waiver or declaration issued and revoke it in whole or in part.
4. The Commission shall maintain a list of all waivers and declarations issued and their conditions, and shall make this list publicly available.
Amendment 523 #
Proposal for a regulation
Article 10 – paragraph 2 a (new)
2 a. Providers of intermediary services may establish a single point of contact serving both for this Regulation and as the single point of contact required under other Union law. When doing so, the provider shall inform the Commission of this decision.
Amendment 526 #
Proposal for a regulation
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in each Member State where the provider offers its services.
Amendment 527 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, since the Commission has issued guidance for strengthening the Code of practice on disinformation, as announced in the European Democracy Action Plan, in May 2021.
Amendment 528 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, since the Commission has issued guidance for strengthening the Code of practice on disinformation, as announced in the European Democracy Action Plan, in May 2021.
Amendment 529 #
Proposal for a regulation
Recital 70
Amendment 535 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format, in a searchable archive of all the previous versions with their date of application.
Amendment 539 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1 a. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, and available remedies including applicable alternative dispute resolution mechanisms. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. Providers of intermediary services shall provide recipients of services with a concise and easily readable summary of the terms and conditions, including information on the available remedies and the possibilities for opt-out, where relevant.
Amendment 541 #
Proposal for a regulation
Recital 76
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV and Article 8 and 9 by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction.
Amendment 542 #
Proposal for a regulation
Recital 78
(78) Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation. In order to ensure coherence between the Member States, the Commission should adopt guidance on the procedures and rules related to the powers of Digital Services Coordinators.
Amendment 546 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Providers of intermediary services shall, when complying with the requirements of this Article, not be required to disclose algorithms or any information that, with reasonable certainty, would enable the deception of consumers or cause consumer harm through the manipulation of their services. This Article shall be without prejudice to Directive (EU) 2016/943.
Amendment 549 #
Proposal for a regulation
Recital 85
(85) Where a Digital Services Coordinator requests another Digital Services Coordinator to take action, the requesting Digital Services Coordinator, or the Board in case it issued a recommendation to assess issues involving more than four Member States, should be able to refer the matter to the Commission in case of any disagreement as to the assessments or the measures taken or proposed or a failure to adopt any measures. If the Commission believes that the Digital Services Coordinator of establishment has not at least partially addressed the request or has not fully justified its decision not to address the request, the Commission, on the basis of the information made available by the concerned authorities, should accordingly be able to request the competent Digital Services Coordinator to re-assess the matter and take the necessary measures to ensure compliance within a defined time period. This possibility is without prejudice to the Commission’s general duty to oversee the application of, and where necessary enforce, Union law under the control of the Court of Justice of the European Union in accordance with the Treaties. A failure by the Digital Services Coordinator of establishment to take any measures pursuant to such a request may also lead to the Commission’s intervention under Section 3 of Chapter IV of this Regulation, where the suspected infringer is a very large online platform.
Amendment 551 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2b. Providers of intermediary services shall refrain from any dark patterns or other techniques to encourage the acceptance of terms and conditions, including giving consent to sharing personal and non-personal data.
Amendment 552 #
Proposal for a regulation
Recital 86
(86) In order to facilitate cross-border supervision and investigations involving several Member States, the Digital Services Coordinators should be able to participate, on a permanent or temporary basis, in joint oversight and investigation activities concerning matters covered by this Regulation and under the authority of the Digital Services Coordinator of the Member State of establishment. Those activities may include other competent authorities and may cover a variety of issues, ranging from coordinated data gathering exercises to requests for information or inspections of premises, within the limits and scope of powers available to each participating authority. The Board may be requested to provide advice in relation to those activities, for example by proposing roadmaps and timelines for activities or proposing ad-hoc task-forces with participation of the authorities involved.
Amendment 554 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2c. Providers of intermediary services shall not require recipients of the service other than traders to make their legal identity public in order to use the service.
Amendment 556 #
Proposal for a regulation
Article 12 – paragraph 2 d (new)
2d. For providers other than very large online platforms, nothing in this Regulation shall prevent a provider of intermediary services from terminating the contractual relationship with its recipients without cause, in the situations provided for in the terms and conditions. Providers of very large online platforms shall issue a statement of the termination to the recipient, and the recipient shall have access to the internal complaint mechanism under Article 17 and the out-of-court mechanism under Article 18.
Amendment 557 #
Proposal for a regulation
Recital 88
(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, with legal personality, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite to its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to the Member State’s Digital Services Coordinator.
Amendment 558 #
Proposal for a regulation
Article 12 a (new)
Amendment 559 #
Proposal for a regulation
Article 12 a (new)
Amendment 560 #
Proposal for a regulation
Recital 89
(89) The Board should contribute to achieving a common Union perspective on the consistent application of this Regulation and to cooperation among competent authorities, including by advising the Commission and the Digital Services Coordinators about appropriate investigation and enforcement measures, in particular vis-à-vis very large online platforms. The Board should also contribute to the drafting of relevant templates, codes of conduct and best practices, and analyse emerging general trends in the development of digital services in the Union.
Amendment 560 #
Proposal for a regulation
Article 12 b (new)
Amendment 561 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish, at least once a year, clear, easily accessible, comprehensible, and detailed reports on any content moderation they engaged in during the relevant period. The reports shall be available in searchable archives. Those reports shall include, in particular, information on the following, as applicable:
Amendment 563 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed to inform the authority issuing the order of its receipt and the effect given to the orders;
Amendment 566 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, market surveillance, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
Amendment 573 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1a. The information provided shall be broken down by Member State in which services are offered and for the Union as a whole.
Amendment 578 #
Proposal for a regulation
Recital 97 a (new)
(97a) The Commission should ensure that it is independent and impartial in its decision-making with regard to both Digital Services Coordinators and providers of services under this Regulation.
Amendment 578 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraphs 1 and 1a shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 579 #
Proposal for a regulation
Recital 98
(98) In view of both the particular challenges that may arise in seeking to ensure compliance by very large online platforms and the importance of doing so effectively, considering their size and impact and the harms that they may cause, the Commission should have strong investigative and enforcement powers to allow it to investigate, enforce and monitor certain of the rules laid down in this Regulation, in full respect of the principle of proportionality and the rights and interests of the affected parties, including the right to challenge any investigative requests before a judicial authority within the Member State of establishment.
Amendment 579 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
2a. Where made available to the public, the annual transparency reports referred to in paragraph 1 shall not include information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider's terms and conditions.
Amendment 585 #
Proposal for a regulation
Recital 99
(99) In particular, the Commission, where it can show grounds for believing that a very large online platform is not compliant with this Regulation, should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties other than individuals provide any relevant evidence, data and information related to those concerns. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers.
Amendment 587 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content, or content that is in breach of their terms and conditions. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means, and may include: (a) a clearly identifiable banner or single reporting button, allowing users to quickly and easily notify the providers of these services of illegal content they have encountered; (b) providing information to the users on what is considered illegal content under Union and national law; (c) providing information to the users on available national public tools to signal illegal content to the competent authorities.
Amendment 590 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question or its breach of the terms and conditions. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 592 #
Proposal for a regulation
Recital 102
(102) In the interest of effectiveness and efficiency, in addition to the general evaluation of the Regulation, to be performed within five years of entry into force, after the initial start-up phase and on the basis of the first three years of application of this Regulation, the Commission should also perform an evaluation of the activities of the Board and of its structure. In addition, the Commission should carry out an assessment of any impact of the costs to European service providers of any similar requirements, including those of Article 11, introduced by third-party states, and of any new barriers to non-EU market access after the adoption of this Regulation. The Commission should also assess the impact on the ability of European businesses and consumers to access and buy products and services from outside the Union.
Amendment 592 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
(a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content, or content that is in breach of the providers’ terms and conditions;
Amendment 594 #
Proposal for a regulation
Recital 104
(104) In order to fulfil the objectives of this Regulation, the power to adopt acts in accordance with Article 290 of the Treaty should be delegated to the Commission to supplement this Regulation. In particular, delegated acts should be adopted in respect of criteria for identification of very large online platforms and of technical specifications for access requests. It is equally important that, where standardisation bodies are unable to agree on the standards needed to implement this Regulation fully, the Commission be able to adopt delegated acts. It is of particular importance that the Commission carries out appropriate consultations and that those consultations be conducted in accordance with the principles laid down in the Interinstitutional Agreement on Better Law-Making of 13 April 2016. In particular, to ensure equal participation in the preparation of delegated acts, the European Parliament and the Council receive all documents at the same time as Member States' experts, and their experts systematically have access to meetings of Commission expert groups dealing with the preparation of delegated acts.
Amendment 594 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, for example the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content, or content that is in breach of the providers’ terms and conditions;
Amendment 595 #
Proposal for a regulation
Recital 105 a (new)
(105a) This Regulation serves as a horizontal framework to ensure the further strengthening and deepening of the Digital Single Market and the internal market, and therefore seeks to lay down rules and obligations which, unless specified, are intended to apply to all providers without regard to individual models of operation. Individual models of operation are often addressed in other Union law regarded as lex specialis. In the case of any potential conflict between this Regulation and those Union acts, the principle of lex specialis derogat legi generali should apply.
Amendment 601 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1a. The parameters referred to in paragraph 1 shall include, at a minimum: (a) whether the recommender system is an automated system and, in that case, the identity of the natural or legal person responsible for the recommender system, if different from the platform provider; (b) clear information about the criteria used by recommender systems; (c) the relevance and weight of each criterion which leads to the information recommended; (d) the goals for which the relevant system has been optimised; (e) if applicable, an explanation of the role that the behaviour of the recipients of the service plays in how the relevant system produces its outputs.
Amendment 608 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out harmonised rules for a safe, predictable, accessible and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
Amendment 616 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-discriminatory and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Amendment 617 #
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
(ba) facilitate innovation, support digital transition and encourage economic growth and an investment climate, in order to create a level playing field for digital services within the internal market that respects and promotes fundamental rights enshrined in the Charter.
Amendment 620 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Providers of hosting services that are also deemed online content-sharing service providers under Directive (EU) 2019/790 shall, in the case of a conflict of laws, apply Directive (EU) 2019/790 as superseding this Article.
Amendment 622 #
Proposal for a regulation
Article 1 – paragraph 3
3. This Regulation shall apply to intermediary services directed at and provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services.
Amendment 625 #
Proposal for a regulation
Article 1 – paragraph 5 – point b
(b) Directive 2010/13/EU as amended by Directive (EU) 2018/1808;
Amendment 631 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
(c) Union law on copyright and related rights, in particular Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market;
Amendment 634 #
Proposal for a regulation
Article 1 – paragraph 5 – point h
(h) Union law on consumer protection and product safety, including Regulation (EU) 2017/2394, Regulation (EU) 2019/1020 and Regulation XXX (General Product Safety Regulation);
Amendment 638 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(ia) Directive (EU) 2019/882
Amendment 642 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
5a. The Commission shall, by [within one year of the adoption of this Regulation], publish guidelines with regard to the relationship between this Regulation and the legislative acts listed in Article 1(5). These guidelines shall clarify any potential conflicts between the conditions and obligations laid down in those legislative acts, which act prevails where actions taken in line with this Regulation fulfil the obligations of another legislative act, and which regulatory authority is competent.
Amendment 647 #
Proposal for a regulation
Article 2 – paragraph 1 – point b
(b) ‘recipient of the service’ means any natural or legal person who, for professional reasons or otherwise, uses the relevant intermediary service, in particular for the purposes of seeking information or making it accessible;
Amendment 650 #
Proposal for a regulation
Article 2 – paragraph 1 – point c
(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft, or profession;
Amendment 650 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, and individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the decision taken by the provider of the online platform not to act upon the receipt of a notice or against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 654 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions whether or not to remove or disable access to or restrict visibility of the information;
Amendment 658 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
Amendment 660 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) decisions whether or not to suspend or terminate the recipients’ account.
Amendment 662 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions whether or not to restrict the ability to monetize content provided by the recipients;
Amendment 664 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 2
— the proactive directing of activities towards one or more Member States.
Amendment 667 #
Proposal for a regulation
Article 2 – paragraph 1 – point e
(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession, irrespective of the legality of those actions;
Amendment 667 #
Proposal for a regulation
Article 17 – paragraph 1 – point c b (new)
(cb) decisions whether or not to apply labels or additional information on content.
Amendment 670 #
Proposal for a regulation
Article 17 – paragraph 1 a (new)
1a. When the decision to remove or disable access to the information is followed by the transmission of this information in accordance with Article 15a, the period of at least six months as set out in paragraph 1 shall be considered to start from the day on which the recipient was informed in accordance with Article 15(2).
Amendment 673 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 1
— a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, including technical auxiliary functional services;
Amendment 675 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that recipients of the service are given the possibility, where necessary, to contact a human interlocutor at the time of the submission of the complaint and that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means.
Amendment 677 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3 a (new)
- an ‘online search engine’ as defined in point (5) of Article 2 of Regulation (EU) 2019/1150;
Amendment 679 #
Proposal for a regulation
Article 2 – paragraph 1 – point f a (new)
(fa) live streaming platform services shall be defined as information society services of which the main purpose, or one of the main purposes, is to give the public access to audio or video material broadcast live by their users, which they organise and promote for profit-making purposes;
Amendment 679 #
Proposal for a regulation
Article 18 – paragraph 1 – introductory part
1. Recipients of the service, and individuals or entities that have submitted notices, addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 680 #
Proposal for a regulation
Article 2 – paragraph 1 – point f b (new)
(fb) private messaging services shall be defined as number-independent interpersonal communications services as defined in Article 2(7) of Directive (EU) 2018/1972, excluding transmission of electronic mail as defined in Article 2 (h) of Directive 2002/58/EC;
Amendment 682 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
The first subparagraph is without prejudice to the right of the recipient concerned, or of the individuals or entities that have submitted notices, to seek redress against the decision before a court in accordance with the applicable law.
Amendment 683 #
Proposal for a regulation
Article 18 – paragraph 1 a (new)
1a. Where a recipient seeks a resolution to multiple complaints, either party may request that the out-of-court dispute settlement body treat and resolve these complaints in a single dispute decision.
Amendment 688 #
Proposal for a regulation
Article 18 – paragraph 2 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms and of individuals or entities that have submitted notices;
Amendment 692 #
Proposal for a regulation
Article 18 – paragraph 2 – point c
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
Amendment 694 #
Proposal for a regulation
Article 18 – paragraph 2 – point d
(d) it is capable of settling disputes in a swift, efficient and cost-effective manner that is accessible for persons with disabilities, and in at least one official language of the Union;
Amendment 695 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service with which it has a direct relationship, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation. For the purpose of this Regulation, a cloud computing service shall not be considered to be an online platform in cases where allowing the dissemination of hyperlinks to a specific content constitutes a minor and ancillary feature.
Amendment 702 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(ha) ‘very large online platform’ means a provider of a hosting service which provides its services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3;
Amendment 704 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(ha) ‘online marketplace’ means an online platform that allows consumers to conclude distance contracts with other traders or consumers on their platform;
Amendment 705 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(ha) ‘cloud computing service’ means a digital service that enables access to a scalable and elastic pool of shareable computing resources;
Amendment 714 #
Proposal for a regulation
Article 2 – paragraph 1 – point i a (new)
(ia) ‘online marketplace’ means an online platform which allows consumers to conclude distance contracts with traders on its platform;
Amendment 716 #
Proposal for a regulation
Article 2 – paragraph 1 – point k a (new)
(ka) ‘trusted flagger’ means an entity that has been nominated by a Digital Services Coordinator based on specific conditions to be authorised to issue priority notifications as to illegal content found on a platform.
Amendment 718 #
Proposal for a regulation
Article 2 – paragraph 1 – point n
(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether the person is incorporated or unincorporated and irrespective of whether the information is designed to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface, normally against remuneration specifically for promoting that message;
Amendment 719 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
Amendment 724 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
(p) ‘content moderation’ means the activities, either through automated or manual means, undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility, monetisation and accessibility of that illegal content or that information, such as demotion, disabling of access to, delisting, demonetisation or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
Amendment 726 #
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
(ca) it has a transparent funding structure, including publishing the sources and amounts of all revenue annually;
Amendment 727 #
Proposal for a regulation
Article 19 – paragraph 2 – point c b (new)
(cb) it is not already a trusted flagger in another Member State.
Amendment 728 #
Proposal for a regulation
Article 19 – paragraph 2 – point c c (new)
Amendment 730 #
Proposal for a regulation
Article 2 – paragraph 1 – point p a (new)
(pa) ‘deep fake’ means a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful;
Amendment 731 #
Proposal for a regulation
Article 2 – paragraph 1 – point q
(q) ‘terms and conditions’ means all terms and conditions or specifications of the service provider, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services and are unilaterally determined by the provider of online intermediary services, that unilateral determination of terms and conditions being evaluated on the basis of an overall assessment for which the relative size of the parties concerned, the fact that a negotiation took place, or that certain provisions thereof might have been subject to such a negotiation and determined together by the relevant provider and recipient are not, in themselves, decisive; or the rules laid down by the intermediary service provider under which users will be allowed to use the intermediation service concerned.
Amendment 735 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) 'dark patterns' means an online interface or a part thereof that via its structure, function or manner of operation subverts or impairs the autonomy, decision-making, or choice of recipients of the service.
Amendment 738 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) ‘dark pattern’ means a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision- making or choice.
Amendment 742 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(qb) ‘persons with disabilities’ means persons with disabilities within the meaning of Article 3(1) of Directive (EU) 2019/882;
Amendment 744 #
Proposal for a regulation
Article 19 a (new)
Amendment 745 #
Proposal for a regulation
Article 2 – paragraph 1 – point q c (new)
(qc) ‘deep fake’ means a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful.
Amendment 748 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content, or content that is in breach of their terms and conditions.
Amendment 752 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
Amendment 753 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) the provider does not modify the informational content;
Amendment 761 #
Proposal for a regulation
Article 20 – paragraph 3 a (new)
3a. Suspensions referred to in paragraphs 1 and 2 may be declared permanent where: (a) compelling reasons of law or public policy, including ongoing criminal investigations, justify avoiding or postponing notice to the recipient; (b) the items removed were components of high-volume campaigns to deceive users or manipulate platform content moderation efforts; or (c) the items removed were related to content covered by [Directive 2011/93/EU updated reference] or [Directive (EU) 2017/541 XXX New Ref to TCO Regulation].
Amendment 770 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online marketplaces, where such an online marketplace presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
Amendment 772 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of providers of online marketplaces, where such an online marketplace presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online marketplace itself or by a recipient of the service who is acting under its authority or control.
Amendment 773 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
Amendment 773 #
Proposal for a regulation
Article 21 – paragraph 2 a (new)
2a. Unless instructed otherwise by the informed authority, the provider shall remove or disable the content. It shall store all content and related data for at least six months.
Amendment 774 #
Proposal for a regulation
Article 21 – paragraph 2 b (new)
2b. Information obtained by a law enforcement or judicial authority of a Member State in accordance with paragraph 1 shall not be used for any purpose other than those directly related to the individual serious criminal offence notified.
Amendment 775 #
Proposal for a regulation
Article 21 – paragraph 2 c (new)
2c. The Commission shall adopt an implementing act setting down a template for notifications under paragraph 1.
Amendment 776 #
Proposal for a regulation
Article 21 – paragraph 2 d (new)
2d. Where a notification of suspicions of criminal offences includes information which may be seen as potential electronic information in criminal proceedings, Regulation XXX [E-evidence] shall apply.
Amendment 781 #
Proposal for a regulation
Article 6
Article 6
Amendment 783 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
Article 22 – paragraph 1 – point c
(c) the payment account details of the trader, where the trader is a natural person;
Amendment 784 #
Proposal for a regulation
Article 22 – paragraph 1 – point d
Article 22 – paragraph 1 – point d
(d) the name, address, telephone number and electronic mail address of the economic operator, established in the Union and carrying out the tasks in accordance with Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or [Article XX of the General Product Safety Regulation], or any relevant act of Union law; _________________ 51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
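Taken together, points (a) through (d) of Article 22(1) describe a structured record the marketplace must obtain before a trader may use its services. A minimal illustrative sketch follows; the class and field names are assumptions for illustration, not terms from the Regulation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TraderRecord:
    """Illustrative record of the Article 22(1) traceability items (names assumed)."""
    name: str
    address: str
    telephone: str
    email: str
    payment_account: Optional[str] = None    # point (c): only where the trader is a natural person
    economic_operator: Optional[str] = None  # point (d): EU economic operator under Regulation (EU) 2019/1020

    def missing_items(self, trader_is_natural_person: bool) -> list[str]:
        """List the Article 22(1) items the marketplace has not yet obtained."""
        missing = [f for f in ("name", "address", "telephone", "email") if not getattr(self, f)]
        if trader_is_natural_person and not self.payment_account:
            missing.append("payment_account")
        return missing
```

For example, a record for a natural-person trader without payment account details would report `["payment_account"]` as the outstanding item.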
Amendment 785 #
Proposal for a regulation
Article 6 – paragraph 1
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union or national law, in conformity with Union law, including the EU Charter of Fundamental Rights, and the requirements set out in this Regulation.
Amendment 793 #
Proposal for a regulation
Article 7 – paragraph 1
Article 7 – paragraph 1
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. This Regulation shall not prevent providers from offering end- to-end encrypted services. The provision of such services shall not constitute a reason for liability or for becoming ineligible for the exemptions from liability.
Amendment 803 #
Proposal for a regulation
Article 8 – paragraph 1
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law that is in conformity with Union law, including the EU Charter of Fundamental Rights, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 806 #
Proposal for a regulation
Article 8 – paragraph 1
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific individual item of illegal content, received from and issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 809 #
Proposal for a regulation
Article 8 – paragraph 1 a (new)
Article 8 – paragraph 1 a (new)
1a. If the provider cannot comply with the removal order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the authority that has issued the order.
Amendment 810 #
Proposal for a regulation
Article 8 – paragraph 1 b (new)
Article 8 – paragraph 1 b (new)
1b. Where the provider does not have its main establishment or legal representative in the Member State of the competent authority that has issued the order and the provider believes that the implementation of an order issued under paragraph 1 would infringe the Charter of Fundamental Rights of the European Union, Union law, or the national law of the Member State in which the main establishment or legal representative of the provider is located, or does not meet the conditions of paragraph 2, the provider shall have the right to submit a reasoned request for a decision of the Digital Services Coordinator from the Member State of establishment. The provider shall inform the authority issuing the order of this submission.
Amendment 811 #
Proposal for a regulation
Article 8 – paragraph 1 c (new)
Article 8 – paragraph 1 c (new)
1c. Upon receiving such a submission, the Digital Services Coordinator shall scrutinise the order in a timely manner and inform the provider of its decision. Where the Digital Services Coordinator agrees with the reasoning of the provider, in whole or in part, it shall inform, without undue delay, the Digital Services Coordinator of the Member State of the judicial or administrative authority issuing the order of its objection. The Digital Services Coordinator may choose to intervene on behalf of the provider in any redress, appeal or other legal process in relation to the order.
Amendment 812 #
Proposal for a regulation
Article 8 – paragraph 1 d (new)
Article 8 – paragraph 1 d (new)
1d. Until an objection under paragraph 1c is withdrawn, any penalties, fines or other sanctions related to the non-implementation of an order issued by the relevant national judicial or administrative authorities shall be suspended and the order shall cease to have legal effects.
Amendment 813 #
Proposal for a regulation
Article 8 – paragraph 1 e (new)
Article 8 – paragraph 1 e (new)
1e. Paragraphs 1b and 1c shall not apply in the case of very large online platforms or where content is manifestly illegal under Union law.
Amendment 816 #
Proposal for a regulation
Article 23 – paragraph 1 – point c a (new)
Article 23 – paragraph 1 – point c a (new)
(ca) the number of advertisements that were removed, labelled or disabled by the online platform and the justification for those decisions;
Amendment 819 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 a (new)
Article 8 – paragraph 2 – point a – indent 1 a (new)
— the identification of the issuing authority and the means to verify the authentication of the order;
Amendment 820 #
Proposal for a regulation
Article 23 – paragraph 4 a (new)
Article 23 – paragraph 4 a (new)
4a. Where published to the general public, the annual transparency reports referred to in paragraph 1 shall not include information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider’s terms and conditions.
Amendment 821 #
Proposal for a regulation
Article 23 a (new)
Article 23 a (new)
Amendment 824 #
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
Article 24 – paragraph 1 – introductory part
Online platforms that directly and indirectly display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear, meaningful, salient, uniform and unambiguous manner and in real time:
Amendment 826 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
Article 24 – paragraph 1 – point b
(b) the natural or legal person or group on whose behalf the advertisement is displayed and the natural or legal person who financed the advertisement;
Amendment 828 #
Proposal for a regulation
Article 8 – paragraph 2 – point b
Article 8 – paragraph 2 – point b
(b) the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective and in any case does not exceed the territory of the Member State of the order;
Amendment 828 #
(c) clear, meaningful and uniform information about the main parameters used to determine the recipient to whom the advertisement is displayed and the logic involved;
Amendment 829 #
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
Article 24 – paragraph 1 – point c a (new)
(ca) whether the advertisement was selected using an automated mechanism, such as ad exchange mechanisms, and if so, the identity of the natural or legal person responsible for the system;
Amendment 830 #
Proposal for a regulation
Article 24 – paragraph 1 – point c b (new)
Article 24 – paragraph 1 – point c b (new)
(cb) if the online platform uses automated systems to determine the recipients of the service to whom the advertisement should be displayed, meaningful information about the reasons why a given advertisement has been deemed relevant for a specific recipient of the service.
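Points (b) through (cb) of Article 24(1), as amended, amount to a per-advertisement disclosure record that must accompany each ad shown to each recipient. A minimal sketch of such a record, with assumed field names, might look like this:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class AdDisclosure:
    """Illustrative per-advertisement disclosure under Article 24(1) as amended (names assumed)."""
    sponsor: str                             # point (b): on whose behalf the ad is displayed
    financier: str                           # point (b): who financed the advertisement
    main_parameters: str                     # point (c): targeting parameters and the logic involved
    automated_selection: bool                # point (ca): whether an automated mechanism (e.g. ad exchange) selected the ad
    system_operator: Optional[str] = None    # point (ca): who is responsible for that mechanism
    relevance_reasons: Optional[str] = None  # point (cb): why the ad was deemed relevant to this recipient

def disclosure_gaps(d: AdDisclosure) -> list[str]:
    """Return the Article 24(1) points an incomplete disclosure would miss."""
    gaps = []
    if not d.sponsor or not d.financier:
        gaps.append("(b)")
    if not d.main_parameters:
        gaps.append("(c)")
    if d.automated_selection and not d.system_operator:
        gaps.append("(ca)")
    return gaps
```

An ad selected by an ad exchange but lacking the identity of the person responsible for the system would, on this reading, fail point (ca).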
Amendment 833 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Article 24 – paragraph 1 a (new)
Online platforms that suggest advertised content that the recipients of the service have not explicitly looked for or subscribed to shall ensure that the recipients of the service can identify, for each specific suggestion, in a clear and unambiguous manner and in real time, meaningful information about the criteria used to suggest this content to the recipient, including, where applicable, personal data of the recipient taken into account pursuant to Article XY.
Amendment 834 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
Article 24 – paragraph 1 b (new)
Providers of online platforms shall, by default, not make the recipients of their service subject to behavioural and micro- targeted advertisements unless the recipient of the service has expressed a freely given, specific, informed and unambiguous consent in line with the requirements under Regulation (EU) 2016/679 and Article 12(2b). Providers of online platforms shall ensure this requirement applies to previous choices expressed by individual recipients of the service.
Amendment 835 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
Article 8 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such case, the point of contact may request the competent authority to provide translation into the language declared by the provider.
Amendment 836 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
Article 8 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such case, the point of contact may request the competent authority to provide translation into the language declared by the provider.
Amendment 837 #
Proposal for a regulation
Article 24 – paragraph 1 c (new)
Article 24 – paragraph 1 c (new)
Providers of online platforms shall provide individual recipients of the service the possibility to modify or influence the parameters used to display advertisements to the recipient of the service. The default parameters shall be the most respectful and protective possible towards the rights of consumers.
Amendment 838 #
Proposal for a regulation
Article 24 – paragraph 1 d (new)
Article 24 – paragraph 1 d (new)
Online platforms shall also build special protections for individual recipients of the service below the age of 16 to limit their exposure to advertising. Advertisements that are targeted or micro targeted toward individuals or segments of individuals who are below the age of 18 on the basis of their personal data, behaviour, the tracking of their activities or profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679 shall not be permitted.
Amendment 840 #
Proposal for a regulation
Article 8 – paragraph 2 – point c a (new)
Article 8 – paragraph 2 – point c a (new)
(ca) the order is issued only where no other effective means are available to bring about the cessation or the prohibition of the infringement;
Amendment 840 #
Proposal for a regulation
Article 24 – paragraph 1 e (new)
Article 24 – paragraph 1 e (new)
Amendment 841 #
Proposal for a regulation
Article 8 – paragraph 2 – point c b (new)
Article 8 – paragraph 2 – point c b (new)
(cb) where more than one provider of intermediary services is responsible for hosting the specific item, the order is issued to the most appropriate provider that has the technical and operational ability to act against the specific item.
Amendment 842 #
Proposal for a regulation
Article 8 – paragraph 2 a (new)
Article 8 – paragraph 2 a (new)
2a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific template and form for such orders.
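The template envisaged here would standardise the elements Article 8(2) requires of an order: identification of the issuing authority and a means of authentication, a statement of reasons, the exact location of the content, a proportionate territorial scope, and the provider's declared language. A hedged sketch of such a form, with assumed field names, and of the checks a provider might run before invoking the proposed Article 8(1a) (manifest errors or insufficient information):

```python
from dataclasses import dataclass, field

@dataclass
class RemovalOrder:
    """Illustrative order form per Article 8(2) as amended (field names assumed)."""
    issuing_authority: str       # indent 1a (new): identification of the issuing authority
    authentication: str          # indent 1a (new): means to verify the authenticity of the order
    statement_of_reasons: str    # point (a): why the item is illegal content
    content_locators: list[str] = field(default_factory=list)  # point (a): exact location(s) of the item
    territorial_scope: str = ""  # point (b): no wider than strictly necessary
    language: str = ""           # point (c): language declared by the provider

def manifest_errors(order: RemovalOrder) -> list[str]:
    """Checks a provider might perform before informing the authority under Article 8(1a)."""
    errors = []
    if not order.content_locators:
        errors.append("no specific item of content identified")
    if not order.authentication:
        errors.append("order cannot be authenticated")
    if not order.statement_of_reasons:
        errors.append("no statement of reasons")
    return errors
```

A complete order passes with no errors; an order missing the locator, authentication means and reasons would trigger the duty to inform the issuing authority without undue delay.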
Amendment 843 #
Proposal for a regulation
Article 8 – paragraph 2 b (new)
Article 8 – paragraph 2 b (new)
2b. Member States shall ensure that providers have a right to appeal and object to implementing the order and shall facilitate the use and access to that right.
Amendment 844 #
Proposal for a regulation
Article 8 – paragraph 2 c (new)
Article 8 – paragraph 2 c (new)
2c. When an order to act against a specific individual item of illegal content is issued by a relevant national judicial or administrative authority, Member States shall ensure that the relevant national judicial or administrative authority duly informs the Digital Services Coordinator from the Member State of the judicial or administrative authority.
Amendment 846 #
Proposal for a regulation
Article 8 – paragraph 3 – subparagraph 1 a (new)
Article 8 – paragraph 3 – subparagraph 1 a (new)
Where, upon receiving the copy of the order, at least three Digital Services Coordinators consider that the order violates Union law or national law that is in conformity with Union law, including the Charter, they may object to the enforcement of the order before the Board, on the basis of a reasoned statement. Following a recommendation of the Board, the Commission may decide whether the order is to be enforced.
Amendment 849 #
Proposal for a regulation
Article 8 – paragraph 4
Article 8 – paragraph 4
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law and administrative law in conformity with Union law, including the Charter of Fundamental Rights. While acting in accordance with such laws, authorities shall not go beyond what is necessary in order to attain the objectives pursued therein.
Amendment 850 #
Proposal for a regulation
Article 8 – paragraph 4
Article 8 – paragraph 4
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law in conformity with Union law, including the EU Charter of Fundamental Rights.
Amendment 856 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 858 #
Proposal for a regulation
Article 9 – paragraph 1
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order. Where no effect has been given to the order, providers of intermediary services shall provide the authority issuing the order, without delay, with a statement of reasons as to why the order was not given effect.
Amendment 860 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
Article 26 – paragraph 1 – point a
(a) the dissemination of illegal content and content that is in breach of their terms and conditions through their services,
Amendment 862 #
Proposal for a regulation
Article 9 – paragraph 1
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, received from and issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order.
Amendment 864 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
Amendment 865 #
Proposal for a regulation
Article 9 – paragraph 1 a (new)
Article 9 – paragraph 1 a (new)
1a. If the provider cannot comply with the information order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the authority that issued the information order.
Amendment 866 #
Proposal for a regulation
Article 9 – paragraph 1 b (new)
Article 9 – paragraph 1 b (new)
1b. Where the provider does not have its main establishment or legal representative in the Member State of the competent authority that issued the order and a provider believes that the implementation of an order issued under paragraph 1 would infringe the Charter, Union law, or the national law of the Member State in which the main establishment or legal representative of the provider is located, or does not meet the conditions of paragraph 2, the provider shall have the right to submit a reasoned request for a decision of the Digital Services Coordinator from the Member State of establishment. The provider shall inform the authority issuing the order of this submission.
Amendment 867 #
Proposal for a regulation
Article 9 – paragraph 1 c (new)
Article 9 – paragraph 1 c (new)
1c. Upon receiving such a submission, the Digital Services Coordinator shall scrutinise the order in a timely manner and inform the provider of its decision. Where the Digital Services Coordinator agrees with the reasoning of the provider, in whole or in part, it shall inform of its objection, without undue delay, the Digital Services Coordinator from the Member State of the judicial or administrative authority issuing the order. The Digital Services Coordinator may choose to intervene on behalf of the provider in any redress, appeal or other legal process in relation to the order.
Amendment 868 #
Proposal for a regulation
Article 9 – paragraph 1 d (new)
Article 9 – paragraph 1 d (new)
1d. Until an objection under paragraph 1c is withdrawn, any penalties, fines or other sanctions related to the non-implementation of an order issued by the relevant national judicial or administrative authorities shall be suspended and the order shall cease to have legal effects.
Amendment 869 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service and amplification of content that is in breach of their terms and conditions, including by means of inauthentic use, such as ‘deep fakes’ or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, democratic values, media freedom and freedom of expression of journalists, as well as their ability to verify facts, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
Amendment 870 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 (new)
Article 9 – paragraph 2 – point a – indent -1 (new)
— the identification of the issuing authority and the means to verify the authentication of the order;
Amendment 872 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective according to which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences;
Amendment 877 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
Article 26 – paragraph 2 a (new)
2a. When conducting risk assessments, very large online platforms shall involve representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. Their involvement shall be tailored to the specific systemic risks that the very large online platform aims to assess.
Amendment 878 #
Proposal for a regulation
Article 9 – paragraph 2 – point c
Article 9 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such case, the point of contact may request the competent authority to provide translation into the language declared by the provider;
Amendment 879 #
Proposal for a regulation
Article 9 – paragraph 2 – point c
Article 9 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such case, the point of contact may request the competent authority to provide translation into the language declared by the provider;
Amendment 882 #
Proposal for a regulation
Article 9 – paragraph 2 – point c a (new)
Article 9 – paragraph 2 – point c a (new)
(ca) the order is issued only where no other effective means are available to receive the same specific item of information;
Amendment 882 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
Amendment 884 #
Proposal for a regulation
Article 9 – paragraph 2 a (new)
Article 9 – paragraph 2 a (new)
2a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific template and form for such orders. It shall ensure that the form meets the standards set out in the Annex of [XXX the regulation on European Production and Preservation Orders for electronic evidence in criminal matters].
Amendment 885 #
Proposal for a regulation
Article 9 – paragraph 2 b (new)
Article 9 – paragraph 2 b (new)
2b. When an order to provide a specific item of information about one or more specific individual recipients of the service is issued by a relevant national judicial or administrative authority, Member States shall ensure that the relevant national judicial or administrative authority duly informs the Digital Services Coordinator from the Member State of the judicial or administrative authority.
Amendment 885 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision- making processes, design, the features or functioning of their services, or their terms and conditions;
Amendment 887 #
Proposal for a regulation
Article 9 – paragraph 4
Article 9 – paragraph 4
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law and administrative law in conformity with Union law.
Amendment 889 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
Article 27 – paragraph 1 – point b
(b) targeted measures aimed at limiting the display of and targeting of advertisements in association with the service they provide;
Amendment 891 #
Proposal for a regulation
Chapter III – title
Chapter III – title
Obligations for a transparent, accessible and safe online environment
Amendment 894 #
Proposal for a regulation
Article 9 a (new)
Article 9 a (new)
Article 9a
Waiver
1. Providers of intermediary services may apply to the Commission for a waiver from the requirements of Chapter III, provided that they are:
(a) not-for-profit or equivalent and serve a manifestly positive role in the public interest;
(b) micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC; or
(c) medium enterprises within the meaning of the Annex to Recommendation 2003/361/EC without any systemic risk related to illegal content.
The providers shall present justified reasons for their request.
2. The Commission shall examine such an application and, after consulting the Board, may issue a waiver, in whole or in part, from the requirements of this Chapter.
3. Upon the request of the Board or the provider, or on its own initiative, the Commission may review a waiver issued and revoke the waiver in whole or in part.
4. The Commission shall maintain a list of all waivers issued and their conditions and shall make this list public. (This amendment should be placed between the Chapter Title and the Section title)
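The waiver grounds lean on the enterprise size classes in the Annex to Recommendation 2003/361/EC. Those thresholds are fixed by the Annex; the helper below is only an illustrative sketch of how they combine (headcount plus either annual turnover or balance-sheet total):

```python
def enterprise_category(headcount: int, turnover_eur: float, balance_sheet_eur: float) -> str:
    """Size classes per the Annex to Recommendation 2003/361/EC.

    Thresholds are those of the Annex; the function itself is an
    illustrative helper, not part of the proposed Article 9a.
    """
    if headcount < 10 and (turnover_eur <= 2e6 or balance_sheet_eur <= 2e6):
        return "micro"
    if headcount < 50 and (turnover_eur <= 10e6 or balance_sheet_eur <= 10e6):
        return "small"
    if headcount < 250 and (turnover_eur <= 50e6 or balance_sheet_eur <= 43e6):
        return "medium"
    return "large"
```

Under point (b), "micro" and "small" results would ground a waiver application outright; a "medium" result would additionally require the absence of any systemic risk related to illegal content under point (c).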
Amendment 896 #
Proposal for a regulation
Article 9 a (new)
Article 9 a (new)
Article 9a
Conflict between Union Acts
1. Where any obligation set down in this Regulation can be viewed as equivalent to, or superseded by, an obligation within another Union act to which a provider of intermediary services is also subject, the provider may apply to the Commission for a waiver from such requirements or a declaration that it should be deemed as having complied with this Regulation, in whole or in part. The provider shall present justified reasons for its request.
2. The Commission shall examine such an application and, after consulting the Board, may issue a waiver or declaration, in whole or in part, from the requirements of this Regulation.
3. Upon the request of the Board or on its own initiative, the Commission may review a waiver or declaration issued and revoke it in whole or in part.
4. The Commission shall maintain a list of all waivers and declarations issued and their conditions and shall make this list public.
Amendment 898 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report in Article 28(3).
Amendment 899 #
Proposal for a regulation
Article 27 – paragraph 1 b (new)
Article 27 – paragraph 1 b (new)
1b. The Board shall evaluate the implementation and effectiveness of mitigating measures undertaken by very large online platforms listed in Article 27(1) and where necessary, may issue recommendations.
Amendment 900 #
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
Article 27 – paragraph 2 – introductory part
2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports of the Board shall be broken down per Member State in which the systemic risks occur and in the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union. The reports shall include the following:
Amendment 902 #
Proposal for a regulation
Article 10 – paragraph 2
Article 10 – paragraph 2
2. Providers of intermediary services shall make publicly available and easily accessible the information necessary to easily identify and communicate with their single points of contact. This must include at least a telephone number, an e-mail address and a postal address of the point of contact. The provider may also include electronic contact forms.
Amendment 903 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
Article 27 – paragraph 2 – point a
(a) identification and assessment of each of the systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
Amendment 905 #
Proposal for a regulation
Article 10 – paragraph 2 a (new)
Article 10 – paragraph 2 a (new)
2a. Providers of intermediary services may establish the same single point of contact for this Regulation and another single point of contact as required under other Union law. When doing so, the provider shall inform the Commission of this decision.
Amendment 908 #
Proposal for a regulation
Article 27 – paragraph 3
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, and following public consultations, shall issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved.
Amendment 911 #
Proposal for a regulation
Article 11 – paragraph 1
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in each Member State where the provider offers its services.
Amendment 912 #
Proposal for a regulation
Article 11 – paragraph 1
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in each Member State where the provider offers its services.
Amendment 913 #
Proposal for a regulation
Article 11 – paragraph 1
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union may designate, in writing, a legal or natural person to act as their legal representative in one of the Member States where the provider offers its services.
Amendment 914 #
Proposal for a regulation
Article 11 – paragraph 1 – subparagraph 1 a (new)
Article 11 – paragraph 1 – subparagraph 1 a (new)
Where a provider of intermediary services chooses not to designate a legal representative, Article 40(3) shall apply.
Amendment 914 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
Amendment 917 #
Proposal for a regulation
Article 11 – paragraph 4
Article 11 – paragraph 4
4. Providers of intermediary services shall notify the name, address, the electronic mail address and telephone number of their legal representative to the Digital Service Coordinator in the Member State where that legal representative resides or is designated. They shall ensure that that information is up to date.
Amendment 919 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by organisations which have been selected by the Commission and:
Amendment 921 #
Proposal for a regulation
Article 11 – paragraph 5 a (new)
Article 11 – paragraph 5 a (new)
5a. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC, other than those which are either a very large online platform or a marketplace.
Amendment 929 #
Proposal for a regulation
Article 29 – paragraph 1
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions and on a designated web page that can be directly reached from the very large online platforms’ online interface, in a clear, accessible and easily comprehensible manner for the general public, the main parameters used in their recommender systems, the optimisation goals of their recommender systems as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679.
Amendment 931 #
Proposal for a regulation
Article 12 – paragraph 1
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used by the provider of the intermediary service for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible, machine-readable format.
Amendment 933 #
Proposal for a regulation
Article 12 – paragraph 1
Article 12 – paragraph 1
1. Providers of intermediary services shall ensure that their terms and conditions prevent the recipients of their services from providing information that is not compliant with Union law or the law of the Member State where the information is provided. Any additional restrictions that providers of intermediary services may impose in relation to the use of their service and the information provided by the recipients of the service shall be in full compliance with the fundamental rights of the recipients of the services as enshrined in the EU Charter on Fundamental Rights.
Amendment 935 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
Article 12 – paragraph 1 a (new)
Amendment 939 #
Proposal for a regulation
Article 12 – paragraph 2
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective and non-arbitrary manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter and, where applicable, any community or other standards created by recipients of the service.
Amendment 940 #
Proposal for a regulation
Article 29 – paragraph 2
Article 29 – paragraph 2
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide clear and easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.
Amendment 941 #
Proposal for a regulation
Article 12 – paragraph 2
Article 12 – paragraph 2
2. Providers of intermediary services shall apply and enforce the restrictions referred to in paragraph 1 in a diligent, objective, timely, proportionate and non-discriminatory manner, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in national and Union law, including the EU Charter on Fundamental Rights.
Amendment 942 #
Proposal for a regulation
Article 12 – paragraph 2
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective, timely, proportionate and non-discriminatory manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved. The fundamental rights of the recipients of the service as enshrined in the Charter shall be applied in particular when limitations are imposed.
Amendment 945 #
Proposal for a regulation
Article 30 – title
Article 30 – title
Additional online advertising transparency and protection
Amendment 950 #
Proposal for a regulation
Article 30 – paragraph 2 – point a
Article 30 – paragraph 2 – point a
(a) the content of the advertisement, including the name of the product, service or brand and the object of the advertisement;
Amendment 952 #
Proposal for a regulation
Article 30 – paragraph 2 – point b a (new)
Article 30 – paragraph 2 – point b a (new)
(ba) the natural or legal person or group who paid for the advertisement;
Amendment 955 #
Proposal for a regulation
Article 30 – paragraph 2 – point e
Article 30 – paragraph 2 – point e
(e) the total number of recipients of the service reached in each country and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically.
Amendment 957 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
Article 30 – paragraph 2 a (new)
2a. The very large online platform shall design and organise its online interface in such a way that recipients of the service can easily and efficiently exercise their rights under applicable Union law in relation to the processing of their personal data for each specific advertisement displayed to the data subject on the platform, in particular:
(a) to withdraw consent or to object to processing;
(b) to obtain access to the personal data concerning the data subject;
(c) to obtain rectification of inaccurate personal data concerning the data subject;
(d) to obtain erasure of personal data without undue delay.
Where a recipient exercises any of these rights, the online platform must inform any parties to whom the personal data concerned in points (a) to (d) have been disclosed, in accordance with Article 19 of Regulation (EU) 2016/679.
Amendment 960 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
Article 30 – paragraph 2 b (new)
2b. Very large online platforms shall be prohibited from profiling or targeting minors with personalised advertising, in compliance with the industry standards laid down in Article 34 and Regulation (EU) 2016/679.
Amendment 962 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
Article 12 – paragraph 2 c (new)
2c. Providers of intermediary services shall provide recipients of services with a concise and easily readable summary of the terms and conditions. That summary shall identify the main elements of the information requirements, including the possibility of easily opting-out from optional clauses and the remedies available.
Amendment 962 #
Proposal for a regulation
Article 30 – paragraph 2 c (new)
Article 30 – paragraph 2 c (new)
2c. Very large online platforms shall take adequate measures to detect inauthentic videos (‘deep fakes’). When detecting such videos, they should label them as inauthentic in a way that is clearly visible for the internet user.
Amendment 963 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
Article 12 – paragraph 2 c (new)
2c. Providers of intermediary services shall not require recipients of the service other than traders to make their legal identity public in order to use the service.
Amendment 963 #
Proposal for a regulation
Article 30 – paragraph 2 d (new)
Article 30 – paragraph 2 d (new)
2d. Very large online platforms shall offer users the opportunity to check if their username and password have been compromised in a data leak, such as through the pwned open source database.
Amendment 964 #
Proposal for a regulation
Article 12 – paragraph 2 d (new)
Article 12 – paragraph 2 d (new)
2d. For providers other than very large online platforms, nothing in this Regulation shall prevent the provider of intermediary services concerned from terminating the contractual relationship with its recipients without cause, in the situations provided for in the terms and conditions. Providers of a very large online platform shall issue a statement for the termination to the recipient, and the recipient shall have access to the internal complaint mechanism under Article 17 and the out-of-court mechanism under Article 18.
Amendment 966 #
Proposal for a regulation
Article 12 a (new)
Article 12 a (new)
Article 12a
General Risk Assessment and Mitigation Measures
1. Providers of intermediary services shall identify, analyse and assess, at least once and at each significant revision of a service thereafter, the potential misuse or other risks stemming from the functioning and use made of their services in the Union. Such a general risk assessment shall be specific to their services and shall include at least risks related to the dissemination of illegal content through their services and any content that might have a negative effect on potential recipients of the service, especially minors and gender equality.
2. Providers of intermediary services which identify potential risks shall, wherever possible, attempt to put in place reasonable, proportionate and effective mitigation measures in line with their terms and conditions.
3. Where the identified risk relates to minors, irrespective of whether the child is acting in accordance with the terms and conditions, mitigation measures shall include, taking into account the industry standards referred to in Article 34, where needed and applicable:
(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure those prioritise the best interests of the child;
(b) adapting or removing system design features that expose children to, or promote, content, contact, conduct and contract risks;
(c) ensuring the highest levels of privacy, safety and security by design and default for children, including any profiling or use of data for commercial purposes;
(d) if a service is targeted at children, providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support.
4. Providers of intermediary services shall, upon request, explain to the Digital Services Coordinator of the Member State of establishment how they undertook this risk assessment and what voluntary mitigation measures they undertook.
Amendment 967 #
Proposal for a regulation
Article 12 a (new)
Article 12 a (new)
Article 12a
General Risk Assessment and Mitigation Measures
1. Providers of intermediary services shall identify, analyse and assess, at least once and at each significant revision of a service thereafter, the potential misuse or other risks stemming from the functioning and use made of their services in the Union. Such a general risk assessment shall be specific to their services and shall include at least risks related to the dissemination of illegal content through their services and any content that might have a negative effect on potential recipients of the service, especially minors.
2. Providers of intermediary services which identify potential risks shall, wherever possible, attempt to put in place reasonable, proportionate and effective mitigation measures in line with their terms and conditions.
3. Where the identified risk relates to minor recipients of the service, irrespective of whether the minor is acting in accordance with the terms and conditions, mitigation measures shall include, where needed and applicable:
(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure those prioritise the best interests of the minor;
(b) adapting or removing system design features that expose minors to, or promote, content, contact, conduct and contract risks;
(c) ensuring the highest levels of privacy, safety and security by design and default for users under the age of 16, including any profiling or use of data for commercial purposes;
(d) if a service is targeted at minors, providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support.
4. Providers of intermediary services shall, upon request, explain to the Digital Services Coordinator of the Member State of establishment how they undertook this risk assessment and what voluntary mitigation measures they undertook.
Amendment 969 #
Proposal for a regulation
Article 12 a (new)
Article 12 a (new)
Article 12a
Protection of minors
1. If a service is primarily aimed at minors, the providers of intermediary services shall explain the conditions and restrictions for the use of the service in an age-appropriate way that is consistent with the rules on children's consent in accordance with Article 8 of the GDPR.
2. The design and interface of services aimed at and highly used by minors must take into account that children do not have the same well-developed cognitive abilities as adults, which makes them more vulnerable to manipulation. The services shall therefore be designed in a way that protects children against manipulation and 'dark patterns'.
Amendment 972 #
Proposal for a regulation
Article 12 b (new)
Article 12 b (new)
Amendment 974 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish in an easily accessible manner, at least twice a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. The reports must be searchable and archived for further use. Those reports shall include, in particular, information on the following, as applicable:
Amendment 976 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish, at least once a year, clear, easily accessible, comprehensible, and detailed reports on any content moderation they engaged in during the relevant period. The reports shall be available in searchable archives. Those reports shall include, in particular, information on the following, as applicable:
Amendment 977 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports to their Digital Services Coordinator of establishment on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:
Amendment 981 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
Article 13 – paragraph 1 – point a
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed to inform the authority issuing the order of its receipt and the effect given to the orders;
Amendment 983 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action. Providers of intermediary services may add additional information as to the reasons for the average time for taking the action.
Amendment 999 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
Article 13 – paragraph 1 a (new)
1a. Where providers of intermediary services do not make the report under paragraph 1 available to the general public, at least a summary of the report under paragraph 1 shall be made available to the general public.
Amendment 1000 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
Article 13 – paragraph 1 a (new)
1a. The information provided shall be broken down per Member State in which services are offered and in the Union as a whole.
Amendment 1003 #
Proposal for a regulation
Article 33 – paragraph 2 a (new)
Article 33 – paragraph 2 a (new)
2a. The reports shall include content moderation broken down per Member State in which the services are offered and in the Union as a whole and shall be published in the official languages of the Member States of the Union.
Amendment 1005 #
Proposal for a regulation
Article 13 – paragraph 2
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC and which are not very large online platforms in accordance with Article 25.
Amendment 1007 #
Proposal for a regulation
Article 13 – paragraph 2
Article 13 – paragraph 2
2. Paragraph 1 and 1a shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 1010 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
Article 13 – paragraph 2 a (new)
2a. Where made available to the public, the annual transparency reports referred to in paragraph 1 shall not include information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider’s terms and conditions.
Amendment 1010 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
Article 34 – paragraph 1 a (new)
1a. The Commission shall support and promote the development and implementation of standards set by relevant European and international standardisation bodies, subject to transparent, multi-stakeholder and inclusive processes in line with Regulation (EU) 1025/2012, for the protection and promotion of the rights of the child, observance of which, once adopted, will be mandatory for very large online platforms, at least for the following:
(a) age assurance and age verification;
(b) child impact assessments;
(c) child-centred and age-appropriate design;
(d) child-centred and age-appropriate terms and conditions.
Amendment 1013 #
Proposal for a regulation
Article 13 a (new)
Article 13 a (new)
Article 13a
Fair consent choice screens
1. Providers of intermediary services that ask the recipients of their service for consent as required by Regulation 2016/679 to collect or process personal data concerning them shall ensure that the end user choice screens shown to that end are designed in a fair and neutral manner and do not in any way subvert or impair user autonomy, decision-making, or choice via the choice screens’ structure, function or manner of operation. In particular, providers shall refrain from:
(a) giving more visual prominence to any of the consent options when asking the recipient of the service for a decision;
(b) repeatedly requesting that a recipient of the service consents to data processing, regardless of the scope or purpose of such processing, especially by presenting a pop-up that interferes with user experience;
(c) urging a recipient of the service to change any setting or configuration of the service after the person in question has already made their choice, including by the use of a technical standard in accordance with paragraph 4;
(d) making the procedure of cancelling a service more cumbersome than signing up to it.
2. The Commission may adopt implementing acts to prescribe binding design aspects and functions of consent choice screens that fulfil the requirements of paragraph 1.
3. Providers of intermediary services shall accept the communication of consent choices made by the recipient of the service through automated means, including through standardised digital signals sent by the recipient’s software used to access the service, such as web browsers and operating systems.
4. The Commission shall promote and facilitate the development of technical standards for the automated communication of consent choices through international and EU standardisation bodies. Where standardisation bodies fail to develop a workable technical standard, the Commission shall, not later than two years after entry into force of this Regulation, designate a binding technical standard for the purpose of paragraph 3.
Amendment 1022 #
Proposal for a regulation
Article 14 – paragraph 1
Article 14 – paragraph 1
1. Providers of hosting services, providers of live streaming platform services and of private messaging services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content, or content that is in breach with their terms and conditions. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means, and may include:
(a) a clearly identifiable banner or single reporting button, allowing users to notify quickly and easily the providers of these services of illegal content they have encountered;
(b) providing information to the users on what is considered illegal content under Union and national law;
(c) providing information to the users on available national public tools to signal illegal content to the competent authorities.
Amendment 1023 #
Proposal for a regulation
Article 14 – paragraph 1
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or non-governmental entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means, and may include:
(a) a clearly identifiable banner or single reporting button, allowing the users of those services to notify quickly and easily the providers of hosting services;
(b) providing information to the users on what is considered illegal content under Union and national law;
(c) providing information to the users on available national public tools to signal illegal content to the competent authorities in Member States where the service is directed.
Amendment 1023 #
Proposal for a regulation
Article 35 – paragraph 3
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain a set of harmonised key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain, in order to facilitate effective cross-platform monitoring.
Amendment 1026 #
Proposal for a regulation
Article 35 – paragraph 4
Article 35 – paragraph 4
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, shall regularly monitor and evaluate the achievement of their objectives, and shall publish their conclusions. Furthermore, they shall ensure that there is a common alert mechanism managed at EU level to allow for real-time and coordinated responses.
Amendment 1028 #
Proposal for a regulation
Article 14 – paragraph 1
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, clearly visible, user- friendly, located in close proximity to the content and allow for the submission of notices exclusively by electronic means.
Amendment 1033 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question or its breach of the terms and conditions. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 1038 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
Article 14 – paragraph 2 – point a
(a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content, or content that is in breach with providers' terms and conditions;
Amendment 1038 #
Proposal for a regulation
Article 36 – paragraph 2 – point b a (new)
Article 36 – paragraph 2 – point b a (new)
(ba) the setting-up of a unique identifier that will enable advertisers and publishers to identify and track a campaign throughout its lifecycle.
Amendment 1042 #
Proposal for a regulation
Article 36 a (new)
Article 36 a (new)
Amendment 1043 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content, or content that is in breach with providers' terms and conditions;
Amendment 1050 #
Proposal for a regulation
Article 38 – paragraph 4 a (new)
Article 38 – paragraph 4 a (new)
4a. Member States shall ensure that the competent authorities have adequate financial and human resources, as well as legal and technical expertise to fulfil their tasks under this Regulation.
Amendment 1052 #
Proposal for a regulation
Article 14 – paragraph 2 – point d
Article 14 – paragraph 2 – point d
(d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete to the best available knowledge.
Amendment 1053 #
Proposal for a regulation
Article 40 – paragraph 1
Article 40 – paragraph 1
1. The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of Chapters III and IV of this Regulation, and final jurisdiction as to disputes on orders issued under Articles 8 and 9.
Amendment 1057 #
Proposal for a regulation
Article 14 – paragraph 3
Article 14 – paragraph 3
3. Adequately substantiated notices that include the elements referred to in paragraph 2 shall be considered to give rise to an obligation to investigate the notice in an effective and timely manner. If a provider is unable to determine if a notice is valid, a provider may ask the Digital Service Coordinator or other national administrative bodies for an opinion before removing or disabling the content.
Amendment 1074 #
Proposal for a regulation
Article 14 – paragraph 6
Article 14 – paragraph 6
6. Providers of hosting services, of live streaming platform services and of private messaging services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, or in respect of the recipient of the service who provided this information, in a timely, diligent, non-discriminatory and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Amendment 1075 #
Proposal for a regulation
Article 14 – paragraph 6
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-discriminatory and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Amendment 1076 #
Proposal for a regulation
Article 14 – paragraph 6
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent, objective and non-discriminatory manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Amendment 1084 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
Article 14 – paragraph 6 a (new)
6a. If the recipient of the service notifies the hosting service of their disagreement with the automated means of decision-making, the hosting service must ensure human review of the decision-making process before any action is taken.
Amendment 1088 #
Proposal for a regulation
Article 45 – paragraph 4
4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of the result of the investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation. The Digital Services Coordinator shall at least conduct a preliminary assessment of the issue raised.
Amendment 1093 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove, disable access to or otherwise restrict the visibility of specific items of information provided by the recipients of the service, or to suspend or terminate monetary payments related to those items, irrespective of the means used for detecting, identifying, removing or disabling access to or reducing the visibility of that information and of the reason for its decision, it shall inform the recipient on a durable medium, at the latest at the time of the removal or disabling of access or the restriction of visibility or the suspension or termination of monetization, of the decision and provide a clear and specific statement of reasons for that decision.
Amendment 1094 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove, disable access to or otherwise restrict the visibility of specific items of information provided by the recipients of the service, or to suspend or terminate monetary payments related to those items, irrespective of the means used for detecting, identifying, removing or disabling access to or reducing the visibility of that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access or the restriction of visibility or the suspension or termination of monetization, of the decision and provide a clear and specific statement of reasons for that decision.
Amendment 1104 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails either the removal of, the disabling of access to, the restriction of the visibility of, or the demonetisation of, the information and, where relevant, the territorial scope of the disabling of access or the restriction;
Amendment 1105 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails either the removal of, or the disabling of access to, the restriction of the visibility of, or the demonetisation of, the information and, where relevant, the territorial scope of the disabling of access or the restriction;
Amendment 1108 #
Proposal for a regulation
Article 15 – paragraph 2 – point b
(b) the facts and circumstances relied on in taking the decision, including where relevant whether the decision was taken pursuant to a notice submitted in accordance with Article 14 and where appropriate, the identity of the notifier;
Amendment 1112 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
Amendment 1119 #
Proposal for a regulation
Article 15 – paragraph 4
4. Providers of hosting services shall publish at least annually the decisions and the statements of reasons, referred to in paragraph 1 in a publicly accessible database managed by the Commission. That information shall not contain personal data.
Amendment 1121 #
Proposal for a regulation
Article 15 – paragraph 4 a (new)
4a. Paragraph 1 shall not apply where:
- a provider of hosting services does not have the information necessary to inform the recipient by a durable medium;
- a provider of hosting services has already informed the recipient of the removal or disabling of the same or similar items of information from the same recipient;
- the content is manifestly illegal;
- the content is deceptive, high-volume commercial content; or
- a judicial or law enforcement authority has requested that the recipient not be informed due to an ongoing criminal investigation, until that investigation is closed.
Amendment 1124 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
Amendment 1128 #
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
2. Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
Amendment 1130 #
Proposal for a regulation
Article 15 a (new)
Article 15a Providers of hosting services shall not use ex-ante control measures based on automated tools or upload-filtering of content for content moderation. Where providers of hosting services use automated tools for content moderation, they shall ensure that qualified staff decide on any action to be taken and that legal content which does not infringe the terms and conditions set out by the providers is not affected. The provider shall ensure that adequate initial and ongoing training on the applicable legislation and international human rights standards, as well as appropriate working conditions, are provided to staff. This paragraph shall not apply to moderating information which has most likely been provided by automated tools.
Amendment 1131 #
Proposal for a regulation
Article 15 a (new)
Article 15a Alternative mechanisms based on an adequacy decision 1. Where a platform has existing alternative notice and action mechanisms as set down by the law of a third country or in accordance with other Union law, the Commission may, upon a request by a provider, issue a decision declaring that these mechanisms ensure an adequate level of protection and fulfil the requirements of Article 14 and Article 15. Before issuing any such decision, the Commission shall consult the Board and the general public at least one month before the decision is adopted.
Amendment 1141 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC and which are not very large online platforms in accordance with Article 25.
Amendment 1143 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, as well as individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the decision taken by the online platform not to act after having received a notice, and against the decisions taken by the online platform on the ground that the information provided by the recipients is illegal content under Union or national law, or incompatible with its terms and conditions:
Amendment 1154 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable access to or restrict the visibility of the information;
Amendment 1167 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions to restrict the ability to monetise content provided by the recipients;
Amendment 1169 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions to restrict the ability to monetize content provided by the recipients.
Amendment 1170 #
Proposal for a regulation
Article 17 – paragraph 1 – point c b (new)
(cb) decisions of online marketplaces to suspend the provision of their services to traders;
Amendment 1173 #
Proposal for a regulation
Article 17 – paragraph 1 a (new)
1a. When the decision to remove or disable access to the information is followed by the transmission of this information in accordance with Article 15a, the period of at least six months as set out in paragraph 1 shall be considered to start from the day on which the recipient was informed in accordance with Article 15(2).
Amendment 1183 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system diligently, objectively and without undue delay, but no later than 10 days after submission. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
Amendment 1193 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that recipients of the service are given the possibility, where necessary, to contact a human interlocutor at the time of the submission of the complaint and that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means.
Amendment 1195 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions referred to in paragraph 4 that would negatively affect the recipients are not taken solely on the basis of automated means.
Amendment 1201 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 and established in the Member State of the provider or the Member State of the recipient, in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 1202 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. Out-of-court dispute settlement shall be carried out within 45 days after submission.
Amendment 1204 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 1207 #
Proposal for a regulation
Article 18 – paragraph 1 a (new)
1a. Where a recipient seeks a resolution to multiple complaints, either party may request that the out-of-court dispute settlement body treat and resolve these complaints in a single dispute decision.
Amendment 1211 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms and is legally distinct from and functionally independent of the government of the Member State or any other public or related private body;
Amendment 1212 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms and is legally distinct from and functionally independent of the government of the Member State or any other public or private body;
Amendment 1225 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
Amendment 1234 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point d
(d) it is capable of settling disputes in a swift, efficient and cost-effective manner that is accessible for persons with disabilities, and in at least one official language of the Union;
Amendment 1244 #
Proposal for a regulation
Article 18 – paragraph 2 a (new)
2a. Certified out-of-court dispute settlement bodies shall conclude dispute resolution proceedings within a reasonable period of time and no later than 90 calendar days after the date on which the certified body has received the complaint.
Amendment 1248 #
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 1
If the body decides the dispute in favour of the recipient of the service, the online platform shall reimburse the recipient for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, and the body does not find the recipient acted in bad faith in the dispute, the recipient shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement.
Amendment 1253 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
6a. This Article is without prejudice to the provisions laid down in Article 43 concerning the ability of recipients of the services to file complaints with the Digital Services Coordinator of their country of residence or in the case of very large online platforms, the Commission.
Amendment 1255 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
6a. This Article shall only take effect on providers other than very large online platforms from [24 months after the date of entry into force of this Regulation].
Amendment 1256 #
Proposal for a regulation
Article 18 a (new)
Article 18a Burden of proof The burden of proof as to whether information constitutes legal or illegal content shall be shifted back to the providers of hosting services.
Amendment 1268 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content, as well as intentional manipulation and exploitation of the service in the sense of Article 26, paragraph 1(c);
Amendment 1273 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform, law enforcement, or other government or relevant commercial entity;
Amendment 1274 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform, law enforcement, or other government or relevant commercial entity;
Amendment 1284 #
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
(ca) it is legally distinct from and functionally independent of the government of the Member State or any other public or private body;
Amendment 1285 #
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
(ca) it has a transparent funding structure, including publishing the sources and amounts of all revenue annually;
Amendment 1286 #
Proposal for a regulation
Article 19 – paragraph 2 – point c b (new)
(cb) it is not already a trusted flagger in another Member State.
Amendment 1287 #
Proposal for a regulation
Article 19 – paragraph 2 – point c c (new)
(cc) it publishes, at least once a year, clear, easily comprehensible and detailed reports on any notices submitted in accordance with Article 14 during the relevant period. The report shall list notices categorised by the identity of the hosting service provider, the type of alleged illegal or terms and conditions violating content concerned, and what action was taken by the provider. In addition, the report shall identify relationships between the trusted flagger and any online platform, law enforcement, or other government or relevant commercial entity, and explain the means by which the trusted flagger maintains its independence.
Amendment 1288 #
Proposal for a regulation
Article 19 – paragraph 2 – subparagraph 1 a (new)
By way of derogation from point (b), a public entity may be awarded the status of trusted flagger for actions not related to intellectual property rights.
Amendment 1289 #
Proposal for a regulation
Article 19 – paragraph 2 a (new)
Amendment 1301 #
Proposal for a regulation
Article 19 – paragraph 4 a (new)
4a. Member States may recognise entities that were awarded the status of trusted flagger in another Member State as trusted flaggers on their own territory. Upon request by a Member State, trusted flaggers can be awarded the status of European trusted flagger by the Board, in accordance with Article 48, paragraph 2. The Commission shall keep a register of European trusted flaggers.
Amendment 1303 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. During this period of investigation by the Digital Services Coordinator, the trusted flagger shall be treated as a non-trusted flagger when using the mechanisms referred to in Article 14, where not suspended under Article 20.
Amendment 1309 #
Proposal for a regulation
Article 19 – paragraph 6
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
Amendment 1317 #
Proposal for a regulation
Article 19 a (new)
Amendment 1320 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content, or content that is in breach with their terms and conditions.
Amendment 1322 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and where proportionate after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.
Amendment 1333 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that repeatedly submit notices or complaints that are manifestly unfounded.
Amendment 1338 #
Proposal for a regulation
Article 20 – paragraph 3 – point d
(d) the intention of the recipient, individual, entity or complainant, including whether submissions were made in bad faith;
Amendment 1340 #
Proposal for a regulation
Article 20 – paragraph 3 – point d a (new)
(da) whether a notice was submitted by an individual user or by an entity or persons with specific expertise related to the content in question;
Amendment 1344 #
Proposal for a regulation
Article 20 – paragraph 3 – point d b (new)
(db) the manner of how notices have been submitted, including by automated means.
Amendment 1345 #
Proposal for a regulation
Article 20 – paragraph 3 a (new)
3a. Suspensions referred to in paragraphs 1 and 2 may be declared permanent where:
(a) compelling reasons of law or public policy, including ongoing criminal investigations, justify avoiding or postponing notice to the recipient;
(b) the items removed were components of high-volume campaigns to deceive users or manipulate platform content moderation efforts; or
(c) the items removed were related to content covered by [Directive 2011/93/EU updated reference] or [Directive (EU) 2017/541 or Regulation (EU) 2021/784 of the European Parliament and of the Council].
Amendment 1348 #
Proposal for a regulation
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
Amendment 1351 #
Proposal for a regulation
Article 21
Amendment 1363 #
Proposal for a regulation
Article 21 – paragraph 2 a (new)
2a. Unless instructed otherwise by the informed authority, the provider shall remove or disable the content. It shall store all content and related data for at least six months.
Amendment 1364 #
Proposal for a regulation
Article 21 – paragraph 2 b (new)
2b. Information obtained by a law enforcement or judicial authority of a Member State in accordance with paragraph 1 shall not be used for any purpose other than those directly related to the individual serious criminal offence notified.
Amendment 1365 #
Proposal for a regulation
Article 21 – paragraph 2 c (new)
2c. The Commission shall adopt an implementing act setting down a template for notifications under paragraph 1.
Amendment 1366 #
Proposal for a regulation
Article 21 – paragraph 2 d (new)
2d. Where a notification of suspicions of criminal offences includes information which may be seen as potential electronic information in criminal proceedings, Regulation XXX [E-evidence] shall apply.
Amendment 1368 #
Proposal for a regulation
Article 22 – title
Traceability of traders on online Marketplaces
Amendment 1369 #
Proposal for a regulation
Article 22 – title
Traceability of traders on online marketplaces
Amendment 1373 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Providers of online marketplaces shall ensure that traders can only use their services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of their services, the online marketplaces have obtained the following information:
Amendment 1375 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Providers of online marketplaces shall ensure that traders can only use their services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of their services for those purposes, the online marketplace has obtained the following information from traders, where applicable:
Amendment 1385 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
(c) the payment account details of the trader, where the trader is a natural person;
Amendment 1386 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
(c) the payment account details of the trader, where the trader is a natural person;
Amendment 1388 #
Proposal for a regulation
Article 22 – paragraph 1 – point d
(d) the name, address, telephone number and electronic mail address of the economic operator established in the Union and carrying out the tasks in accordance with Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51, or [Article XX of the General Product Safety Regulation], or any relevant act of Union law; __________________ 51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
Amendment 1389 #
Proposal for a regulation
Article 22 – paragraph 1 – point d
(d) the name, address, telephone number and electronic mail address of the economic operator established in the Union and carrying out the tasks in accordance with Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or [Article XX of the General Product Safety Regulation] or any relevant act of Union law; __________________ 51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
Amendment 1393 #
Proposal for a regulation
Article 22 – paragraph 1 – point f
(f) a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law to the best of their abilities.
Amendment 1399 #
Proposal for a regulation
Article 22 – paragraph 1 a (new)
1a. Providers of online marketplaces shall require traders to provide the information referred to in points (a) and (e) immediately upon initial registration for their services. Traders shall be required to provide any supplementary material relating to the information requirements set out in Article 22(1) within a reasonable period, and prior to the use of the service and the offering of products and services to consumers.
Amendment 1400 #
Proposal for a regulation
Article 22 – paragraph 1 a (new)
1a. Providers of online marketplaces shall require traders to provide the information referred to in points (a) and (e) immediately upon initial registration for their services. Traders shall be required to provide any supplementary material relating to the information requirements set out in Article 22(1) within a reasonable period, and no later than before offering products and services to consumers.
Amendment 1405 #
Proposal for a regulation
Article 22 – paragraph 2
2. The providers of online marketplaces shall, upon receiving that information and before allowing traders to use their services, make best efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is accurate through the use of any freely accessible official online database or online interface made available by an authorized administrator or a Member State or the Union, or through direct requests to the trader to provide supporting documents from reliable sources.
Amendment 1406 #
Proposal for a regulation
Article 22 – paragraph 2
2. The providers of online marketplaces shall, upon receiving that information, make best efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is accurate through the use of any freely accessible official online database or online interface made available by an authorised administrator or a Member State or the Union, or through direct requests to the trader to provide supporting documents from reliable sources.
Amendment 1416 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
Where the providers of online marketplaces obtain sufficient indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that marketplace shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
Amendment 1417 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
Where the providers of online marketplaces obtain indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that online marketplace shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
Amendment 1418 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 2
Where the trader fails to correct or complete that information, the providers of online marketplaces shall suspend the provision of their service to the trader in relation to the offering of products or services to consumers located in the Union until the request is fully complied with.
Amendment 1419 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 2
Where the trader fails to correct or complete that information, the online marketplace shall suspend the provision of its service to the trader in relation to the offering of products or services to consumers located in the Union until the request is fully complied with.
Amendment 1424 #
Proposal for a regulation
Article 22 – paragraph 3 a (new)
3a. The providers of online marketplaces shall ensure that traders are given the ability to discuss any information viewed as inaccurate or incomplete directly with a trader before any suspension of services. This may take the form of the internal complaint-handling system under Article 17.
Amendment 1425 #
Proposal for a regulation
Article 22 – paragraph 3 a (new)
3a. The providers of online marketplaces shall ensure that traders are given the ability to discuss any information viewed as inaccurate or incomplete directly with a trader before any suspension of services. This may take the form of the internal complaint-handling system under Article 17.
Amendment 1426 #
Proposal for a regulation
Article 22 – paragraph 3 b (new)
3b. If an online marketplace rejects an application for services or suspends services to a trader, the trader shall have recourse to the systems under Article 17 and Article 43 of this Regulation.
Amendment 1427 #
Proposal for a regulation
Article 22 – paragraph 3 b (new)
3b. If an online marketplace rejects an application for services or suspends services to a trader, the trader shall have recourse to the systems under Article 17 and Article 43 of this Regulation.
Amendment 1428 #
Proposal for a regulation
Article 22 – paragraph 3 c (new)
3c. Traders shall be solely liable for the accuracy of the information provided and shall inform the online marketplace without delay of any changes to the information provided.
Amendment 1429 #
Proposal for a regulation
Article 22 – paragraph 3 c (new)
3c. Traders shall be solely liable for the accuracy of the information provided and shall inform without delay the online marketplace of any changes to the information provided.
Amendment 1432 #
Proposal for a regulation
Article 22 – paragraph 4
4. The online marketplace shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of its contractual relationship with the trader concerned. It shall subsequently delete the information no later than six months after the final conclusion of a distance contract.
Amendment 1436 #
Proposal for a regulation
Article 22 – paragraph 4
4. The providers of online marketplaces shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information.
Amendment 1438 #
Proposal for a regulation
Article 22 – paragraph 5
5. Without prejudice to paragraph 2, the providers of online marketplaces shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation.
Amendment 1439 #
Proposal for a regulation
Article 22 – paragraph 5
5. Without prejudice to paragraph 2, the online marketplace shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation.
Amendment 1443 #
Proposal for a regulation
Article 22 – paragraph 6
6. The providers of online marketplaces shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner.
Amendment 1448 #
Proposal for a regulation
Article 22 – paragraph 7
Amendment 1449 #
Proposal for a regulation
Article 22 – paragraph 7
Amendment 1461 #
Proposal for a regulation
Article 22 a (new)
Article 22a
Compliance by design
1. Providers of online marketplaces shall design and organise their online interface in a fair and user-friendly way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law.
2. The online interface shall allow traders to provide, in particular, the information referred to under paragraph 6 of Article 22, the information referred to in Article 6 of Directive 2011/83/EU on Consumer Rights, information allowing for the unequivocal identification of the product or the service and, where applicable, information on the sustainability of products and information on labelling, including CE marking, according to the Union legislation on product safety and compliance.
3. This Article is without prejudice to additional requirements under other Union acts, including the [General Product Safety Regulation] and [Market Surveillance Regulation].
Amendment 1462 #
Proposal for a regulation
Article 22 a (new)
Amendment 1466 #
Proposal for a regulation
Article 22 b (new)
Article 22b
Right to information
1. Where a provider of an online marketplace becomes aware, irrespective of the means used, of the illegal nature of a product or service offered through its services, it shall inform, wherever possible, those recipients of the service that had acquired such product or contracted such service during the last six months about the illegality, the identity of the trader and any means of redress.
2. Where the provider of the online marketplace does not have the contact details of the recipients of the service referred to in paragraph 1, the provider shall make publicly available and easily accessible on their online interface the information concerning the illegal products or services removed, the identity of the trader and any means of redress.
Amendment 1472 #
Proposal for a regulation
Article 23 – paragraph 1 – point c a (new)
(ca) the number of advertisements that were removed, labelled or disabled by the online platform and the justification for those decisions;
Amendment 1478 #
Proposal for a regulation
Article 23 – paragraph 4
4. The Commission shall adopt implementing acts to establish a set of key performance indicators and lay down templates concerning the form, content and other details of reports pursuant to paragraph 1.
Amendment 1479 #
Proposal for a regulation
Article 23 – paragraph 4
4. The Commission shall adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1.
Amendment 1480 #
Proposal for a regulation
Article 23 – paragraph 4 a (new)
4a. Where published to the general public, the annual transparency reports referred to in paragraph 1 shall not include information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider’s terms and conditions.
Amendment 1483 #
Proposal for a regulation
Article 24 – title
Online advertising transparency and control
Amendment 1486 #
Proposal for a regulation
Article 24 – paragraph 1 – point a
(a) that the information displayed on the interface or parts thereof is an online advertisement, including through prominent and harmonised marking;
Amendment 1487 #
Proposal for a regulation
Article 24 – paragraph 1 – point a
(a) that the information displayed on the interface or parts thereof is an online advertisement, including through prominent and harmonised marking;
Amendment 1488 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
Amendment 1493 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) clear, meaningful and uniform information about the main parameters used to determine the recipient to whom the advertisement is displayed and the logic involved;
Amendment 1496 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) clear, meaningful and uniform information about the main parameters used to determine the recipient to whom the advertisement is displayed.
Amendment 1502 #
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
(ca) whether the advertisement was displayed using an automated tool and the identity of the person responsible for that tool;
Amendment 1505 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
The online platform shall design and organise its online interface in such a way that recipients of the service can easily and efficiently exercise their rights under applicable Union law in relation to the processing of their personal data for each specific advertisement displayed to the data subject on the platform, in particular:
(a) to withdraw consent or to object to processing;
(b) to obtain access to the personal data concerning the data subject;
(c) to obtain rectification of inaccurate personal data concerning the data subject;
(d) to obtain erasure of personal data without undue delay.
Where a recipient exercises any of these rights, the online platform must inform any parties to whom the personal data concerned in points (a) to (d) have been disclosed, in accordance with Article 19 of Regulation (EU) 2016/679.
Amendment 1506 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Without prejudice to other Union acts, online platforms that display user-generated content that may include sponsored information or other information equivalent to advertising, which is normally provided against remuneration, shall include in their terms and conditions an obligation for the recipients of their service to inform other recipients when they have received remuneration or any other goods in kind for their content. A failure to inform the platform or other recipients shall constitute a violation of the provider’s terms and conditions.
Amendment 1511 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
The Commission shall adopt an implementing act establishing harmonised specifications for the marking referred to in paragraph 1(a) of this Article.
Amendment 1513 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
Where a recipient exercises any of the rights referred to in points (a), (c) or (d) of paragraph 2, the online platform must immediately cease displaying advertisements using the personal data concerned or using parameters which were set using this data.
Amendment 1514 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
Providers of intermediary services shall inform the natural or legal person on whose behalf the advertisement is displayed where the advertisement has been displayed.
Amendment 1515 #
Proposal for a regulation
Article 24 – paragraph 1 c (new)
Online platforms that display advertising on their online interfaces shall ensure that advertisers:
(a) can request and obtain information on where their advertisements have been placed;
(b) can request and obtain information on which broker treated their data;
(c) can indicate on which specific location their ads cannot be placed.
In case of non-compliance with this provision, advertisers shall have the right to judicial redress.
Amendment 1517 #
Proposal for a regulation
Article 24 a (new)
Article 24a
Recommender systems - prominence of public interest journalism
1. Online platforms shall ensure due prominence of public interest journalism on their services. Services that cater to special interests may be exempted from this obligation. Appropriate prominence measures should include the use of technical standards established in a participatory and transparent manner in order to identify media outlets and entities operating according to the highest, internationally recognised professional norms to produce reliable and accurate information.
2. Providers of public interest journalism shall be identified through voluntary, self-regulatory European standards or European standardisation deliverables as defined by Regulation (EU) No 1025/2012 (‘technical standards’), which are transparently developed, governed and enforced. Any of those standards shall be based on internationally accepted best practices and ethical norms to serve as legitimate criteria to implement the due prominence obligation. The application of these technical standards must be attributed and disclosed by and to all parties involved.
3. Appropriate measures under this provision shall not discriminate on the basis of content or viewpoint. Intermediaries shall not treat non-compliance with or non-usage of such technical standards as a reason to exclude, down-rank, demote or otherwise actively affect the visibility or monetisation of content in a negative way. In order to demonstrate compliance with their duty to ensure due prominence for public interest journalism on their services, online intermediaries shall establish mandatory transparent mechanisms and metrics of indexation regarding discoverability and visibility in search ranks, news feeds and products, including the provision of data and information on prioritisation, personalisation and recommendation algorithms, audits and complaints in an accountable manner.
4. A Digital Services Coordinator shall monitor and assess whether appropriate measures adopted by online intermediaries under this Article are sufficient to contribute to media pluralism and diversity in their respective national markets. To this end, the Digital Services Coordinator should rely on self-regulatory and co-regulatory mechanisms.
5. Recipients of services shall always have a clear and easily accessible choice to opt out of the appropriate measures designed to ensure due prominence to public interest journalism.
Amendment 1522 #
Proposal for a regulation
Article 24 b (new)
Article 24b
Transparency on algorithm modifications
1. Providers of online platforms shall be transparent about changes in their referencing and recommendation rules, even if made on an experimental basis, and shall immediately inform the regulators, their users and the authors of referenced content, allowing these changes to be foreseen by those affected by them.
2. Users may refer to the regulator to ask it to give its opinion on the negative impact of changes to the referencing and recommendation rules, so that it can require the platform to remedy this impact.
Amendment 1531 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platform services, live streaming platform services, private messaging services and search engine services which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
Amendment 1537 #
Proposal for a regulation
Article 25 – paragraph 3 – subparagraph 1 a (new)
Such a methodology shall ensure the following in relation to active recipients:
(1) that automated interactions, accounts or data scans by a non-human (‘bots’) are not included;
(2) that the mere viewing of a service without purchase, logging in or other active identification of a recipient shall not be counted as an active recipient;
(3) that the number shall be based on each service individually;
(4) that recipients connected on multiple devices are counted only once;
(5) that indirect use of a service, via a third party or linking, shall not be counted;
(6) where an online platform is hosted by another provider of intermediary services, that the active recipients are assigned solely to the online platform closest to the recipient;
(7) that the average number is maintained for a period of at least six months.
Amendment 1545 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis and at least once a year thereafter, the probability and severity of any systemic risks stemming from the design, intrinsic characteristics, functioning and use made of their services in the Union. The risk assessment shall be broken down per Member State in which services are offered and in the Union as a whole. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 1550 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 1555 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination of illegal content and content that is in breach of their terms and conditions through their services, including unsafe and non-compliant products and services, in the case of online marketplaces;
Amendment 1556 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination of illegal content and content that is in breach of their terms and conditions through their services;
Amendment 1560 #
Proposal for a regulation
Article 26 – paragraph 1 – point a a (new)
(aa) the funding of illegal content, including models based on advertisement;
Amendment 1563 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
Amendment 1564 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the EU Charter on Fundamental Rights, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
Amendment 1565 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
Amendment 1573 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service and amplification of content that is in breach of their terms and conditions, including by means of inauthentic use, such as ‘deep fakes’ or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, democratic values, media freedom and freedom of expression of journalists, as well as their ability to verify facts, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
Amendment 1576 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service and amplification of content that is in breach of their terms and conditions, including by means of inauthentic use, or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
Amendment 1584 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how and whether their content moderation systems, recommender systems and systems for selecting and displaying advertisements influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions, as well as potential infringements of consumer rights by businesses active on the platform or by the platforms themselves.
Amendment 1590 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how and whether their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
Amendment 1593 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
2a. When conducting risk assessments, very large online platforms shall involve representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. Their involvement shall be tailored to the specific systemic risks that the very large online platform aims to assess.
Amendment 1601 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective measures to mitigate the probability and severity of any systemic risks, tailored to address the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
Amendment 1602 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective measures to mitigate the probability and severity of any systemic risks, tailored to address the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
Amendment 1606 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
Amendment 1609 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision- making processes, design, the features or functioning of their services, or their terms and conditions;
Amendment 1612 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision- making processes, design, the features or functioning of their services, or their terms and conditions;
Amendment 1613 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
(b) targeted measures aimed at limiting the display of and targeting of advertisements in association with the service they provide or the alternative placement and display of public service advertisements or other related factual information;
Amendment 1614 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
(b) targeted measures aimed at limiting the display of and targeting of advertisements in association with the service they provide;
Amendment 1625 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Very large online platforms shall, where appropriate, conduct their risk assessments referred to in Article 26 and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. Where no such involvement takes place, this shall be made clear in the transparency report referred to in Article 33.
Amendment 1626 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report in Article 28(3).
Amendment 1627 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. The Board shall evaluate the implementation and effectiveness of mitigating measures undertaken by very large online platforms listed in Article 27(1) and where necessary, may issue recommendations.
Amendment 1630 #
Proposal for a regulation
Article 27 – paragraph 1 b (new)
1b. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, to the Board in view of issuing specific recommendations and to independent auditors for the purposes of the audit report. Following the written explanation of the reasons of the very large online platform not to put in place mitigating measures, and where necessary, the Board shall issue specific recommendations as to the mitigation measures that very large online platforms shall implement. Very large online platforms shall, within one month of receiving these recommendations, implement the recommended measures or set out any alternative measures they intend to take to address the identified risks. In case of systemic failure of a very large online platform to take effective mitigating measures and in case of repeated non-compliance with the recommendations, the Board may advise the Commission and the Digital Services Coordinators to impose sanctions.
Amendment 1631 #
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
2. The Board, in cooperation with the Commission, shall publish comprehensive reports once a year. The reports of the Board shall be broken down per Member State in which the systemic risks occur and in the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union. The reports shall include the following:
Amendment 1637 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of each of the systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
Amendment 1641 #
Proposal for a regulation
Article 27 – paragraph 2 – subparagraph 1 a (new)
The reports of the Board shall include information both broken down per Member State in which the systemic risks occur and in the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union.
Amendment 1644 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators and following public consultations, shall issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, enshrined in the Charter, of all parties involved.
Amendment 1649 #
Proposal for a regulation
Article 27 – paragraph 3 a (new)
3a. The requirement to put in place mitigation measures shall not require an obligation to impose general monitoring or active fact-finding obligations.
Amendment 1653 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:
Amendment 1658 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
Amendment 1659 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
Amendment 1664 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by organisations which have been selected by the Commission and:
Amendment 1672 #
Proposal for a regulation
Article 28 – paragraph 2 – point c a (new)
(ca) have been certified by the Commission for the performance of this task;
Amendment 1676 #
Proposal for a regulation
Article 28 – paragraph 3 – point f a (new)
(fa) a description of specific elements that could not be audited, and an explanation of why these could not be audited;
Amendment 1677 #
Proposal for a regulation
Article 28 – paragraph 3 – point f a (new)
(fa) a description of specific elements that could not be audited, and an explanation of why these could not be audited;
Amendment 1678 #
Proposal for a regulation
Article 28 – paragraph 3 – point f b (new)
(fb) where the audit opinion could not reach a conclusion for specific elements within the scope of the audit, a statement of reasons for the failure to reach such conclusion.
Amendment 1679 #
Proposal for a regulation
Article 28 – paragraph 3 – point f b (new)
(fb) where the audit opinion could not reach a conclusion for specific elements within the scope of the audit, a statement of reasons for the failure to reach such conclusion.
Amendment 1684 #
Proposal for a regulation
Article 28 – paragraph 4 a (new)
4a. Where an audit report performed in accordance with paragraph 1 finds total compliance, or partial compliance with only minor issues, the very large online platform may request from the Commission a waiver of, or delay to, further audit reports. When granted, the maximum delay shall be two years since the last audit report.
Amendment 1685 #
Proposal for a regulation
Article 28 – paragraph 4 b (new)
4b. Where an audit report contains information that could be misused in order to harm the security and privacy of recipients of the platform, the very large online platform may request from the Commission that such information be removed or summarised in any public version of the audit report. The Commission shall consider any such request and may grant it if deemed merited.
Amendment 1691 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions and on a designated web page that can be directly reached and easily found from the very large online platforms’ online interface, in a clear, accessible and easily comprehensible manner for the general public, the main parameters used in their recommender systems, the optimisation goals of their recommender systems as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679.
Amendment 1692 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions and on a designated web page that can be directly reached from the very large online platforms’ online interface, in a clear, accessible and easily comprehensible manner for the general public, the main parameters used in their recommender systems, the optimisation goals of their recommender systems as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679.
Amendment 1695 #
Proposal for a regulation
Article 29 – paragraph 1 – subparagraph 1 a (new)
This duty is without prejudice to any trade secrets regarding the underlying algorithms. Very large online platforms are not required to disclose any information which could easily be used to manipulate search results to the detriment of customers and other end users.
Amendment 1701 #
Proposal for a regulation
Article 29 – paragraph 2
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide clear and easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.
Amendment 1704 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2a. Online platforms shall ensure that their online interface is designed in such a way that it does not risk misleading or manipulating the recipients of the service.
Amendment 1709 #
Proposal for a regulation
Article 30 – title
Additional transparency for online advertisements and ‘deep fakes’ audiovisual media
Amendment 1710 #
Proposal for a regulation
Article 30 – title
Additional transparency for online advertising and "deep fakes" audiovisual media
Amendment 1711 #
Proposal for a regulation
Article 30 – title
Additional online advertising transparency and protection
Amendment 1716 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make available to relevant authorities and vetted researchers, meeting the requirements of Article 31(4), through application programming interfaces a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
Amendment 1721 #
Proposal for a regulation
Article 30 – paragraph 2 – point a
(a) the content of the advertisement, including the name of the product, service or brand and the object of the advertisement;
Amendment 1722 #
Proposal for a regulation
Article 30 – paragraph 2 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and any related payments received;
Amendment 1724 #
Proposal for a regulation
Article 30 – paragraph 2 – point b a (new)
(ba) the natural or legal person who paid for the advertisement;
Amendment 1725 #
Proposal for a regulation
Article 30 – paragraph 2 – point c a (new)
(ca) the natural or legal person or group who paid for the advertisement;
Amendment 1732 #
Proposal for a regulation
Article 30 – paragraph 2 – point e
(e) the total number of recipients of the service reached in each country and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically.
Amendment 1738 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. Very large online platforms shall be prohibited from profiling children under the age of 16 for commercial practices, including personalised advertising, in compliance with industry standards laid down in Article 34 and Regulation (EU) 2016/679.
Amendment 1739 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. The Board shall, after consulting trusted flaggers and vetted researchers, publish guidelines on the structure and organisation of repositories created pursuant to paragraph 1.
Amendment 1740 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. The Board shall, after consulting with trusted flaggers and vetted researchers, publish guidelines on the structure and organisation of repositories created pursuant to paragraph 1.
Amendment 1743 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2b. Where a very large online platform becomes aware that a piece of content is a deep fake, the provider shall label the content as inauthentic in a way that is clearly visible to the recipient of the service.
Amendment 1744 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2b. Very large online platforms shall make their best effort to detect inauthentic videos (‘deep fakes’). When detecting such videos, they should label them as inauthentic in a way that is clearly visible for the internet user.
Amendment 1745 #
Proposal for a regulation
Article 30 – paragraph 2 c (new)
2c. The very large online platform shall design and organise its online interface in such a way that recipients of the service can easily and efficiently exercise their rights under applicable Union law in relation to the processing of their data for each specific advertisement displayed to the data subject on the platform, in particular: (a) to withdraw consent or to object to processing; (b) to obtain access to the data concerning the data subject; (c) to obtain rectification of inaccurate data concerning the data subject; (d) to obtain erasure of data without undue delay. Where a recipient exercises any of these rights, the online platform must inform any parties to whom the personal data concerned in points (a) to (d) have been disclosed.
Amendment 1746 #
Proposal for a regulation
Article 30 – paragraph 2 c (new)
2c. Very large online platforms selling advertising for display on their online interface, shall ensure via standard contractual clauses with the purchasers of advertising space that the content with which the advertisement is associated is compliant with the terms and conditions of the platform, or with the law of the Member States where the recipients of the service to whom the advertisement will be displayed is located.
Amendment 1747 #
Proposal for a regulation
Article 30 – paragraph 2 d (new)
2d. Very large online platforms that display advertising on their online interfaces shall conduct at their own expense, and upon request of advertisers, independent audits performed by organisations complying with the criteria set out in Article 28(2). Such audits shall be based on fair and proportionate conditions agreed between platforms and advertisers, shall be conducted with a reasonable frequency and shall entail: (a) conducting quantitative and qualitative assessment of cases where advertising is associated with illegal content or with content incompatible with platforms’ terms and conditions; (b) monitoring for and detection of fraudulent use of their services to fund illegal activities; (c) assessing the performance of their tools in terms of brand safety. The audit report shall include an opinion on the performance of platforms’ tools in terms of brand safety. Where the audit opinion is not positive, the report shall make operational recommendations to the platforms on specific measures in order to achieve compliance. The platforms shall make available to advertisers, upon request, the results of such audit.
Amendment 1748 #
Proposal for a regulation
Article 30 – paragraph 2 d (new)
2d. Where a recipient exercises any of the rights referred to in points (a), (c) or (d) of paragraph 2c, the online platform must without undue delay cease displaying advertisements using the personal data concerned or using parameters which were set using this data.
Amendment 1749 #
Proposal for a regulation
Article 30 – paragraph 2 e (new)
2e. Very large online platforms that display advertising on their online interfaces shall ensure that advertisers: (a) can request and obtain information on where their advertisements have been placed; (b) can request and obtain information on which broker treated their data;
Amendment 1750 #
Proposal for a regulation
Article 31 – paragraph 1
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and without delay, full access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes. With regard to moderation and recommender systems, very large online platforms shall provide upon request the Digital Services Coordinator or the Commission with access to algorithms and associated data that allow the detection of possible biases which could lead to the dissemination of illegal content, or content that is in breach of their terms and conditions, or presents threats to fundamental rights, including freedom of expression. Where a bias is detected, very large online platforms shall expeditiously correct it following the recommendations of the Digital Services Coordinator or the Commission. Very large online platforms should be able to demonstrate their compliance at every step of the process pursuant to this Article.
Amendment 1755 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, or to civil society organisations engaged in monitoring Rule of Law, Fundamental Rights and European values, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1) or for educational purposes.
Amendment 1757 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment, three Digital Services Coordinators of destination or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification, understanding and mitigation of systemic risks as set out in Articles 26(1) and 27.
Amendment 1760 #
Proposal for a regulation
Article 31 – paragraph 3 a (new)
3a. Upon request by the recipient of the service, or at least once a year, very large online platforms shall make available to the recipient of the service comprehensive information about the data concerning the recipient of the service that was used in the previous year. The information shall encompass a listing of the data that was collected, how it was used and with what third parties it was shared. Online platforms shall present this information in a way that makes it easy to understand.
Amendment 1765 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, scientific researchers shall be affiliated with academic institutions, be independent from commercial interests and from the very large online platform they seek data from, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
Amendment 1768 #
Proposal for a regulation
Article 31 – paragraph 4 a (new)
4a. Where a very large online platform or a Digital Services Coordinator has grounds to believe that a researcher is acting outside the purpose of paragraph 2 or no longer respects the conditions of paragraph 4, access to data shall be withdrawn and the Digital Services Coordinator of establishment shall decide if and when access shall be restored and under what conditions.
Amendment 1773 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1, 2 and 3a and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 1775 #
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested for the following reasons: (a) in case of a request under paragraph 1, the very large online platform does not have and cannot obtain with reasonable effort access to the data; (b) in case of a request under paragraph 2, the very large online platform does not have access to the data or providing access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets.
Amendment 1787 #
Proposal for a regulation
Article 31 – paragraph 7 a (new)
7a. Digital Service Coordinators and the Commission shall, once a year, report the following information: (a) the number of requests made to them as referred to in paragraphs 1 and 2; (b) the number of such requests that have been declined or withdrawn by the Digital Service Coordinator or the Commission and the reasons for which they have been declined or withdrawn, including following a request to the Digital Service Coordinator or the Commission from a very large online platform to amend a request as referred to in paragraphs 1 and 2.
Amendment 1788 #
Proposal for a regulation
Article 31 – paragraph 7 a (new)
7a. Upon completion of the research envisaged in Article 31(2), the vetted researchers shall make their research publicly available, taking into account the rights and interests of the recipients of the service concerned in compliance with Regulation (EU) 2016/679.
Amendment 1789 #
Proposal for a regulation
Article 31 – paragraph 7 b (new)
7b. Digital Service Coordinators and the Commission shall, once a year, report the following information: (a) the number of requests made to them as referred to in paragraphs 1 and 2; (b) the number of such requests that have been declined by the Digital Service Coordinator or the Commission and the reasons for which they have been declined; (c) the number of such requests that have been declined by the Digital Service Coordinator or the Commission, including the reasons for which they have been declined, following a request to the Digital Service Coordinator or the Commission from a very large online platform to amend a request as referred to in paragraphs 1 and 2.
Amendment 1799 #
Proposal for a regulation
Article 33 – paragraph 1 – subparagraph 1 a (new)
Such reports shall include content moderation information separated and presented for each Member State in which the services are offered and for the Union as a whole. The reports shall be published in at least one of the official languages of the Member States of the Union in which services are offered.
Amendment 1802 #
Proposal for a regulation
Article 33 – paragraph 2 a (new)
2a. The reports shall include content moderation information broken down per Member State in which the services are offered and in the Union as a whole and shall be published in the official languages of the Member States of the Union.
Amendment 1812 #
Proposal for a regulation
Article 34 – paragraph 1 – introductory part
1. The Commission shall support and promote the development and implementation of voluntary industry standards set by relevant European and international standardisation bodies subject to transparent, multi- stakeholder and inclusive processes in line with Regulation (EU) No. 1025/2012, at least for the following:
Amendment 1823 #
Proposal for a regulation
Article 34 – paragraph 1 – point f
(f) transparency obligations under Article 24 and transmission of data between advertising intermediaries in support of transparency obligations pursuant to points (b) and (c) of Article 24.
Amendment 1827 #
Proposal for a regulation
Article 34 – paragraph 1 – point f a (new)
(fa) accessibility of elements and functions of online platforms and digital services for persons with disabilities, aiming at consistency and coherence with existing harmonised accessibility requirements where these elements and functions are not already covered by existing harmonised European standards;
Amendment 1828 #
Proposal for a regulation
Article 34 – paragraph 1 – point f a (new)
(fa) self-regulatory, certifiable and machine-readable criteria for the transparency of ownership and professionalism of editorial processes to identify reliable sources of information pursuant to Article 24 a;
Amendment 1835 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
1a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child, observance of which, once adopted, will be mandatory for very large online platforms, at least for the following: (a) age assurance and age verification; (b) child impact assessments; (c) child-centred and age-appropriate design; (d) child-centred and age-appropriate terms and conditions.
Amendment 1840 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
Amendment 1847 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board may encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the effective application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on consumer protection, competition and the protection of personal data, as well as the Charter of Fundamental Rights.
Amendment 1851 #
Proposal for a regulation
Article 35 – paragraph 2
Amendment 1858 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 1863 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall ensure a balanced, inclusive, multi-stakeholder and transparent governance for the codes of conduct. The Commission and the Board shall ensure the participation and meaningful inclusion of civil society organisations representing the public interest, that the codes of conduct set out clear and precise provisions and fundamental rights objectives, contain effective and specific key performance indicators to evaluate the measures, and take due account of the needs and interests of all interested parties, in particular citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission, their respective Digital Service Coordinators of establishment and the Board on any measures taken and their outcomes, as measured against the law and the key performance indicators that they contain.
Amendment 1867 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain a set of harmonised key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain in order to facilitate effective cross-platform monitoring.
Amendment 1868 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly as needed to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.
Amendment 1870 #
Proposal for a regulation
Article 35 – paragraph 4
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, shall regularly monitor and evaluate the achievement of their objectives, and publish their conclusions. Furthermore, they shall ensure that there is a common alert mechanism managed at Union level to allow for real-time and coordinated responses.
Amendment 1872 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Commission and the Board shall regularly and transparently monitor and evaluate the achievement of the objectives of, or failure to meet, the codes of conduct, having regard to this Regulation, other applicable law, feedback received by stakeholders, and the key performance indicators that they may contain. If the results of the evaluation show that the code or codes of conduct are ineffective or that the commitments are not being met, the competent Digital Service Coordinators shall impose effective, proportionate and dissuasive sanctions. In addition, the Commission shall introduce a legislative proposal following the ordinary legislative procedure.
Amendment 1873 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain. In case of systematic and repetitive failure to comply with the Codes of Conduct, the Board shall, as a measure of last resort, take a decision to temporarily suspend or definitively exclude platforms that do not meet their commitments as signatories to the Codes of Conduct.
Amendment 1874 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain. In case of systematic and repetitive failure to comply with the Codes of Conduct, the Board shall, as a measure of last resort, take a decision to temporarily suspend or definitively exclude platforms that do not meet their commitments as signatories to the Codes of Conduct.
Amendment 1879 #
Proposal for a regulation
Article 36
Amendment 1881 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities, to contribute to further transparency for all players in the online advertising value chain, beyond the requirements of Articles 24 and 30.
Amendment 1882 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities, to contribute to further transparency for all actors in the online advertising value chain, beyond the requirements of Articles 24 and 30.
Amendment 1888 #
Proposal for a regulation
Article 36 – paragraph 2 – point b a (new)
Article 36 – paragraph 2 – point b a (new)
(ba) the setting-up of a unique identifier that will enable advertisers and publishers to identify and track a campaign throughout its lifecycle.
Amendment 1889 #
Proposal for a regulation
Article 36 – paragraph 3
Article 36 – paragraph 3
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. The Commission shall evaluate the application of those codes three years after the application of this Regulation.
Amendment 1890 #
Proposal for a regulation
Article 36 – paragraph 3
Article 36 – paragraph 3
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. The Commission shall evaluate the application of those codes two years after the application of this Regulation.
Amendment 1891 #
Proposal for a regulation
Article 36 – paragraph 3 a (new)
Article 36 – paragraph 3 a (new)
3a. The Commission shall encourage all the players in the online advertising value chain to endorse and comply with the commitments stated in the codes of conduct.
Amendment 1892 #
Proposal for a regulation
Article 36 – paragraph 3 a (new)
Article 36 – paragraph 3 a (new)
3a. The Commission shall encourage all the actors in the online advertising eco-system to endorse and comply with the commitments stated in the codes of conduct.
Amendment 1893 #
Proposal for a regulation
Article 36 a (new)
Article 36 a (new)
Article 36a Codes of conduct for the protection of minors 1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers and organisations representing minors, parents and civil society organisations or relevant authorities to further contribute to the protection of minors online. 2. The Commission shall aim to ensure that the codes of conduct pursue an effective protection of minors online, which respects their rights as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment. The Commission shall aim to ensure that the codes of conduct address at least: (a) age verification and age assurance models, taking into account the industry standards referred to in Article 34; (b) child-centred and age-appropriate design, taking into account the industry standards referred to in Article 34. 3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.
Amendment 1909 #
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph -1 (new)
Article 38 – paragraph 2 – subparagraph -1 (new)
-1. Member States shall not designate the regulatory authorities referred to in Article 30 of Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services as competent authorities or as Digital Services Coordinators.
Amendment 1910 #
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 1 a (new)
Article 38 – paragraph 2 – subparagraph 1 a (new)
When a Member State is subject to a procedure referred to in Article 7(1) or 7(2) of the Treaty on European Union, or against which a procedure based on Regulation 2020/2092 has been initiated, the Commission shall additionally confirm that the Digital Services Coordinator proposed by that Member State fulfils the requirements laid down in Article 39 before that Digital Services Coordinator can be designated.
Amendment 1915 #
Proposal for a regulation
Article 38 – paragraph 3 – subparagraph 2 a (new)
Article 38 – paragraph 3 – subparagraph 2 a (new)
This paragraph applies mutatis mutandis to the certification process for out-of- court dispute settlement bodies as described in Article 18(2) and the award of the status of trusted flagger as described in Article 19(2).
Amendment 1917 #
Proposal for a regulation
Article 38 – paragraph 4 a (new)
Article 38 – paragraph 4 a (new)
4a. Member States shall ensure that the competent authorities have adequate financial and human resources, as well as legal and technical expertise to fulfil their tasks under this Regulation.
Amendment 1918 #
Proposal for a regulation
Article 38 – paragraph 4 a (new)
Article 38 – paragraph 4 a (new)
4a. Member States shall ensure that the competent authorities have adequate financial and human resources, as well as legal and technical expertise to fulfil their tasks under this Regulation.
Amendment 1921 #
Proposal for a regulation
Article 39 – paragraph 1 – subparagraph 1 a (new)
Article 39 – paragraph 1 – subparagraph 1 a (new)
Member States shall ensure that the Digital Services Coordinators are legally distinct from the government and functionally independent of their respective governments and of any other public or private body.
Amendment 1926 #
Proposal for a regulation
Article 40 – paragraph 1
Article 40 – paragraph 1
1. The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of Chapters III and IV of this Regulation and final jurisdiction as to disputes on orders issued under Articles 8 and 9.
Amendment 1928 #
Proposal for a regulation
Article 40 – paragraph 1 a (new)
Article 40 – paragraph 1 a (new)
1a. By means of derogation from paragraph 1, the Member State in which the consumers have their residence shall have jurisdiction for the purposes of Articles 22, 22a and 22b of this Regulation and the Member State in which the authority issuing the order is situated shall have jurisdiction for the purposes of Articles 8 and 9 of this Regulation.
Amendment 1935 #
Proposal for a regulation
Article 40 – paragraph 3 a (new)
Article 40 – paragraph 3 a (new)
3a. Paragraph 3 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC and which are not very large online platforms. Such enterprises shall be deemed to be under the jurisdiction of the Member State where their point of contact resides or is established. Where no point of contact is established or resides in a Member State, paragraph 3 shall apply.
Amendment 1939 #
Proposal for a regulation
Article 40 – paragraph 4
Article 40 – paragraph 4
4. Paragraphs 1, 1a, 2 and 3 are without prejudice to Article 43(2), the second subparagraph of Article 50(4) and the second subparagraph of Article 51(2) and the tasks and powers of the Commission under Section 3.
Amendment 1954 #
Proposal for a regulation
Article 41 – paragraph 3 a (new)
Article 41 – paragraph 3 a (new)
3a. Following a request to the Commission, and in cases of infringements that persist, could cause serious harm to recipients of the service or could seriously affect their fundamental rights, the Digital Services Coordinator of the country of destination may be entitled to additional powers in the framework of joint investigations as referred to in Article 46.
Amendment 1955 #
Proposal for a regulation
Article 41 – paragraph 6 a (new)
Article 41 – paragraph 6 a (new)
6a. The Commission shall publish guidelines by [six months after adoption] on the powers and procedures of the Digital Services Coordinators. Member States shall follow these guidelines or explain any deviation to the Commission.
Amendment 1962 #
Proposal for a regulation
Article 42 – paragraph 4 a (new)
Article 42 – paragraph 4 a (new)
4a. Member States shall ensure that administrative or judicial authorities issuing orders pursuant to Articles 8 and 9 shall only issue penalties or fines in line with this Article.
Amendment 1963 #
Proposal for a regulation
Article 42 a (new)
Article 42 a (new)
Amendment 1969 #
Proposal for a regulation
Article 43 – paragraph 1
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established or, in the case of very large online platforms, with the Commission. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment or, in the case of very large online platforms, to the Commission. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority.
Amendment 1972 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Article 43 – paragraph 1 a (new)
Where the complaint concerns an alleged harm upon the recipients of the service, the Member State where the recipient resides shall have jurisdiction for the purposes of the complaint.
Amendment 1974 #
Proposal for a regulation
Article 43 a (new)
Article 43 a (new)
Amendment 1986 #
Proposal for a regulation
Article 45 – paragraph 1 – subparagraph 2
Article 45 – paragraph 1 – subparagraph 2
Where the Board has reasons to suspect that a provider of intermediary services infringed this Regulation in a manner involving at least three Member States, it shall request the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation.
Amendment 1993 #
Proposal for a regulation
Article 45 – paragraph 2 a (new)
Article 45 – paragraph 2 a (new)
2a. A request or recommendation pursuant to paragraph 1 shall be transmitted to the Commission at the same time as it is communicated to the Digital Services Coordinator of establishment. Where the Commission believes that the request or recommendation is unmerited, or where the Commission is currently taking action on the same substantial matter, it may ask for the request or recommendation to be withdrawn.
Amendment 1998 #
Proposal for a regulation
Article 45 – paragraph 4
Article 45 – paragraph 4
4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto, as well as a statement of reasons in the case of a decision, following its investigation, not to take measures to ensure compliance with this Regulation.
Amendment 2006 #
Proposal for a regulation
Article 45 – paragraph 6
Article 45 – paragraph 6
6. The Commission, in cooperation with the Digital Services Coordinators, shall assess the matter within three months following the referral of the matter pursuant to paragraph 5, after having consulted the Digital Services Coordinator of establishment and, unless it referred the matter itself, the Board.
Amendment 2010 #
Proposal for a regulation
Article 45 – paragraph 7
Article 45 – paragraph 7
7. Where, pursuant to paragraph 6, the Commission, in cooperation with the Digital Services Coordinators, concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Services Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months of that request. Where the Digital Services Coordinator of establishment fails to comply with the request to take the necessary measures before the end of the two-month period, the Commission shall reallocate the case without delay to the Digital Services Coordinator initiating the request.
Amendment 2014 #
Proposal for a regulation
Article 46 – title
Article 46 – title
Joint investigations, cooperation among Digital Services Coordinators and requests for Commission intervention
Amendment 2017 #
Proposal for a regulation
Article 46 – paragraph 1 – subparagraph 1
Article 46 – paragraph 1 – subparagraph 1
Digital Services Coordinators may participate in joint investigations, which may be coordinated with the support of the Board, with regard to matters covered by this Regulation, concerning providers of intermediary services operating in several Member States. Such joint investigations shall be under the supervision of the Digital Services Coordinator of establishment of the provider under investigation.
Amendment 2020 #
Proposal for a regulation
Article 46 – paragraph 1 a (new)
Article 46 – paragraph 1 a (new)
1a. Where the Digital Services Coordinator of the country of destination considers that an alleged infringement exists and causes serious harm to a large number of recipients of the service in that Member State, or could seriously affect their fundamental rights, it may request the Commission to set up a joint investigation between the Digital Services Coordinator of the country of establishment and the requesting Digital Services Coordinator of the country of destination.
Amendment 2021 #
Proposal for a regulation
Article 46 – paragraph 1 b (new)
Article 46 – paragraph 1 b (new)
1b. The Commission, in cooperation with the Digital Services Coordinators, shall assess such a request and, following a positive opinion of the Board, shall set up a joint investigation in which the Digital Services Coordinator of the country of destination may be entitled to exercise the following additional powers with respect to the provider of intermediary services concerned by the alleged infringement: (a) to obtain access to the confidential version of the reports published by the intermediary service providers referred to in Article 13 and, where applicable, in Articles 23 and 24, as well as to the annual reports drawn up by the other competent authorities pursuant to Article 44; (b) to obtain access to data collected by the Digital Services Coordinator of the country of establishment for the purpose of supervision of that provider on the territory of the Digital Services Coordinator of the country of destination; (c) to initiate proceedings and assess the matter with a view to taking specific investigatory or enforcement measures to ensure compliance, where the suspected seriousness of the infringement requires an immediate response that would not allow for the provisions of Article 45 to apply; and (d) to request interim measures, as referred to in Article 41(2)(e).
Amendment 2022 #
Proposal for a regulation
Article 46 – paragraph 1 c (new)
Article 46 – paragraph 1 c (new)
1c. The Commission decision setting up the joint investigation shall define a deadline by which the Digital Services Coordinator of the country of establishment and the Digital Services Coordinator launching the request pursuant to paragraph 2 shall agree on a common position on the joint investigation and, where applicable, on the enforcement measures to be adopted. If no agreement is reached within this deadline, the case shall be referred to the Commission pursuant to Article 45(5).
Amendment 2030 #
Proposal for a regulation
Article 47 – paragraph 1
Article 47 – paragraph 1
1. An independent advisory group of Digital Services Coordinators on the supervision of providers of intermediary services named ‘European Board for Digital Services’ (the ‘Board’) is established and shall have legal personality.
Amendment 2049 #
Proposal for a regulation
Article 48 – paragraph 1
Article 48 – paragraph 1
1. The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator may participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them. Where a Member State has more than one representative present, solely the final word of the Digital Services Coordinator shall be taken as the position of the Member State in question.
Amendment 2054 #
Proposal for a regulation
Article 48 – paragraph 2 – subparagraph 1 a (new)
Article 48 – paragraph 2 – subparagraph 1 a (new)
Where a Member State has more than one representative present, solely the Digital Services Coordinator shall be able to vote.
Amendment 2067 #
Proposal for a regulation
Article 48 – paragraph 5 a (new)
Article 48 – paragraph 5 a (new)
5a. The Board shall, where appropriate, consult interested parties and give them the opportunity to comment within a reasonable period. The Board shall make the results of the consultation procedure publicly available.
Amendment 2070 #
Proposal for a regulation
Article 48 – paragraph 6
Article 48 – paragraph 6
6. The Board shall adopt its rules of procedure by a two-thirds majority of its members, following the consent of the Commission.
Amendment 2081 #
Proposal for a regulation
Article 49 – paragraph 1 – point c a (new)
Article 49 – paragraph 1 – point c a (new)
(ca) continually develop guidance and best practices for the development and design of interfaces to minimise dark patterns;
Amendment 2082 #
Proposal for a regulation
Article 49 – paragraph 1 – point c a (new)
Article 49 – paragraph 1 – point c a (new)
(ca) issue specific recommendations for the implementation of Article 27 and advise on possible application of sanctions in case of repeated non-compliance;
Amendment 2092 #
Proposal for a regulation
Article 49 a (new)
Article 49 a (new)
Article 49a Reports 1. The Board shall draw up an annual report regarding its actions. The report shall be made public and be transmitted to the European Parliament, to the Council and to the Commission in all official languages of the Member States. 2. The annual report shall include, among other information, a review of the practical application of the opinions, guidelines, recommendations, advice and any other measures taken under Article 49(1).
Amendment 2093 #
Proposal for a regulation
Article 49 b (new)
Article 49 b (new)
Article 49b Confidentiality 1. The discussions of the Board shall be confidential where the Board deems it necessary, as provided for in its rules of procedure. 2. Access to documents submitted to members of the Board, experts and representatives of third parties shall be governed by Regulation (EC) No 1049/2001 of the European Parliament and of the Council.
Amendment 2099 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
Article 50 – paragraph 1 – subparagraph 2
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
Amendment 2100 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
Article 50 – paragraph 1 – subparagraph 2
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period.
Amendment 2120 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
Amendment 2130 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
Article 51 – paragraph 2 – subparagraph 1
When the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
Amendment 2140 #
Proposal for a regulation
Article 51 a (new)
Article 51 a (new)
Article 51a Requirements for the Commission 1. The Commission shall perform its tasks under this Regulation in an impartial, transparent and timely manner. The Commission shall ensure that its units given responsibility for this Regulation have adequate technical, financial and human resources to carry out their tasks. 2. When carrying out its tasks and exercising its powers in accordance with this Regulation, the Commission shall act with complete independence. It shall remain free from any external influence, whether direct or indirect, and shall neither seek nor take instructions from any other public authority or any private party.
Amendment 2145 #
Proposal for a regulation
Article 52 – paragraph 2
Article 52 – paragraph 2
2. When sending a simple request for information to the very large online platform concerned or other person referred to in Article 52(1), the Commission shall state the legal basis and the purpose of the request, specify what information is required and set the time period within which the information is to be provided, and the penalties provided for in Article 59 for supplying incorrect or misleading information. The statement of purpose shall include reasoning on why and how the information is necessary and proportionate to the purpose and why it cannot be obtained by other means.
Amendment 2150 #
Proposal for a regulation
Article 52 – paragraph 4
Article 52 – paragraph 4
4. The owners of the very large online platform concerned or other person referred to in Article 52(1) or their representatives and, in the case of legal persons, companies or firms, or where they have no legal personality, the persons authorised to represent them by law or by their constitution shall supply the information requested on behalf of the very large online platform concerned or other person referred to in Article 52(1). Lawyers duly authorised to act may supply the information on behalf of their clients. The latter shall remain fully responsible if the information supplied is incomplete, incorrect or misleading.
Amendment 2179 #
Proposal for a regulation
Article 57 – paragraph 1
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor and audit the effective implementation and compliance with this Regulation and the Charter of Fundamental Rights by the very large online platform concerned, including the operation of any algorithm in the provision of its services. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms.
Amendment 2223 #
Proposal for a regulation
Article 59 – paragraph 4
Article 59 – paragraph 4
4. In fixing the amount of the fine, the Commission shall have regard to the nature, gravity, duration and recurrence of the infringement, to any fines issued under Article 42 and the need to avoid sanctioning the same infringement twice and, for fines imposed pursuant to paragraph 2, to the delay caused to the proceedings.
Amendment 2282 #
Proposal for a regulation
Article 69 – paragraph 2
Article 69 – paragraph 2
2. The delegation of power referred to in Articles 23, 25, 31 and 34 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation].
Amendment 2284 #
Proposal for a regulation
Article 69 – paragraph 3
Article 69 – paragraph 3
3. The delegation of power referred to in Articles 23, 25, 31 and 34 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.
Amendment 2292 #
Proposal for a regulation
Article 73 – paragraph 4 a (new)
Article 73 – paragraph 4 a (new)
4a. By three years from the date of application of this Regulation at the latest, the Commission shall carry out an assessment of any impact on European service providers of the costs of any similar requirements, including those of Article 11, introduced by third countries, and of any new barriers to non-EU market access after the adoption of this Regulation. The Commission shall also assess the impact on the ability of European businesses and consumers to access and buy products and services from outside the Union.
Amendment 2293 #
Proposal for a regulation
Article 74 – paragraph 1 a (new)
Article 74 – paragraph 1 a (new)
1a. Chapter III, section 4 shall apply from [date - 3 months after its entry into force].
Amendment 2294 #
Proposal for a regulation
Article 74 – paragraph 2
Article 74 – paragraph 2
2. This Regulation, with the exception of Chapter III, section 4, shall apply from [date - twelve months after its entry into force].