229 Amendments of Kim VAN SPARRENTAK related to 2020/0361(COD)
Amendment 192 #
Proposal for a regulation
Recital 4 a (new)
(4a) Online advertisement plays an important role in the online environment, including in relation to the provision of information society services. However, certain forms of online advertisement can contribute to significant risks, ranging from advertisements that are themselves illegal content, to the creation of financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, to misleading or exploitative marketing or the discriminatory display of advertising, with an impact on the equal treatment and the rights of consumers. Consumers are largely unaware of the volume and granularity of the data that is being collected and used to deliver personalised and micro-targeted advertisements, and have little agency and limited means to stop or control data exploitation. The significant reach of a few online platforms, their access to extensive datasets and their participation at multiple levels of the advertising value chain have created challenges for businesses, traditional media services and other market participants seeking to advertise or to develop competing advertising services. In addition to the information requirements resulting from Article 6 of Directive 2000/31/EC, stricter rules on targeted advertising and micro-targeting are needed, in favour of less intrusive forms of advertising that do not require extensive tracking of the interaction and behaviour of recipients of the service. Therefore, providers of information society services may only deliver and display online advertising to a recipient or a group of recipients of the service when this is done on the basis of contextual information, such as keywords or metadata. Providers should not deliver and display online advertising to a recipient or a clearly identifiable group of recipients of the service on the basis of personal or inferred data relating to those recipients or groups of recipients.
Where providers deliver and display advertisements, they should be required to ensure that the recipients of the service have certain individualised information necessary for them to understand why and on whose behalf the advertisement is displayed, including sponsored content and paid promotion.
Amendment 198 #
Proposal for a regulation
Recital 5 a (new)
(5a) Given the cross-border nature of the services concerned, Union action to harmonise accessibility requirements for intermediary services across the internal market is vital to avoid market fragmentation and to ensure that the equal right of access to, and choice of, those services by all consumers and other recipients of services, including persons with disabilities, is protected throughout the Union. A lack of harmonised accessibility requirements for digital services and platforms will also create barriers to the implementation of existing Union legislation on accessibility, as many of the services falling under those laws will rely on intermediary services to reach end-users. Therefore, accessibility requirements for intermediary services, including their online interfaces, must be consistent with existing Union accessibility legislation, such as the European Accessibility Act and the Web Accessibility Directive, so that no one is left behind as a result of digital innovation. This aim is in line with the Union of Equality: Strategy for the Rights of Persons with Disabilities 2021-2030 and the Union’s commitment to the United Nations’ Sustainable Development Goals.
Amendment 207 #
Proposal for a regulation
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the directing of activities towards one or more Member States. The directing of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top-level domain. The directing of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) No 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. __________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
Amendment 215 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of, rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – the proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not, or not fully, addressed by those other acts, as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. Therefore, Chapter III (Articles 10 to 37) also applies as a horizontal framework, mutatis mutandis, to intermediary services when implementing other secondary legislation, to the extent that no more specific rules are laid down. __________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) …/.. of the European Parliament and of the Council – the proposed Terrorist Content Online Regulation.
Amendment 222 #
Proposal for a regulation
Recital 11
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected.
Amendment 231 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined appropriately and should also cover unlawful information directly relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that directly relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, the unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, illegally traded animals, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law, and what the precise nature or subject matter of the law in question is.
Amendment 247 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Accordingly, where access to information requires registration or admittance to a group of users, that information should be considered to be disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision or selection of whom to grant access. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, may, in general, not be considered as a dissemination to the public. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. __________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36.
Amendment 254 #
Proposal for a regulation
Recital 15 a (new)
(15a) Ensuring that providers of intermediary services can offer strong and effective end-to-end encryption is essential for trust in and security of digital services in the Digital Single Market, and effectively prevents unauthorised third-party access.
Amendment 275 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to content where it is evident to a layperson, without any substantive analysis, that the content is manifestly illegal, or where it has become aware of the unlawful nature of that content. The removal or disabling of access should be undertaken in observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation, in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and, where appropriate, act against the allegedly illegal content.
Amendment 289 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the part of an average and reasonably well-informed consumer.
Amendment 294 #
Proposal for a regulation
Recital 25
Amendment 295 #
Proposal for a regulation
Article 2 a (new)
Article 2a
Targeting of digital advertising
1. Providers of information society services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of determining the recipients to whom advertisements are displayed.
2. This provision shall not prevent information society services from determining the recipients to whom advertisements are displayed on the basis of contextual information, such as keywords, the language setting communicated by the device of the recipient or the geographical region of the recipients to whom an advertisement is displayed.
3. The use of the contextual information referred to in paragraph 2 shall only be permissible if it does not allow for the direct or, by means of combining it with other information, indirect identification of one or more natural persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person or persons.
Amendment 305 #
Proposal for a regulation
Recital 26
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed and open online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content.
Amendment 308 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting services. Domain name system (DNS) registration services can also benefit from the exemptions from liability set out in this Regulation.
Amendment 317 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
Amendment 323 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders. The applicable rules on the mutual recognition of court decisions should be unaffected.
Amendment 366 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of discriminatory, unfair or arbitrary outcomes.
Amendment 370 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report in a standardised and machine-readable format, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC40, or that are not-for-profit services with fewer than 100 000 monthly active users. __________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 373 #
Proposal for a regulation
Recital 39 a (new)
(39a) Recipients of the service should be empowered to make autonomous decisions, inter alia, regarding the acceptance of and changes to terms and conditions, advertising practices, privacy and other settings, and recommender systems when interacting with intermediary services. However, dark patterns typically exploit cognitive biases and prompt online consumers to purchase goods and services that they do not want or to reveal personal information they would prefer not to disclose. Therefore, providers of intermediary services should be prohibited from deceiving or nudging recipients of the service and from subverting or impairing the autonomy, decision-making or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof (‘dark patterns’). This includes, but is not limited to, exploitative design choices that direct the recipient to actions that benefit the provider of intermediary services but which may not be in the recipients’ interests, presenting choices in a non-neutral manner, repetitively requesting or pressuring the recipient to make a decision, or hiding or obscuring certain options.
Amendment 379 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Content that has been notified and that is not manifestly illegal should remain accessible while the assessment of its legality by the competent authority is still pending. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. Recipients of the service who provided the information to which the notice relates should be given the opportunity to reply before a decision is taken. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
Amendment 412 #
Proposal for a regulation
Article 2 b (new)
Amendment 413 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent, accurate and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations, consumer protection organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic or discriminatory expressions online or to combatting digital violence or supporting victims of digital violence. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions and respect exceptions and limitations to intellectual property rights.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 __________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
Amendment 427 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justifies, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of a person, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion, while ensuring a high level of security of the information concerned in order to protect such information against accidental or unlawful destruction, accidental loss or alteration, or unauthorised or unlawful storage, processing, access or disclosure. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. __________________ 44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
Amendment 430 #
Proposal for a regulation
Article 13 a (new)
Amendment 439 #
Proposal for a regulation
Recital 49 a (new)
(49a) In order to contribute to a transparent online environment for consumers that supports the green transition, online platforms that allow consumers to conclude distance contracts with traders should provide consumers in real time with clear and unambiguous information on the environmental impact of the products and services offered, such as the use of sustainable and efficient delivery methods, sustainable and ecological packaging, as well as the environmental costs of returning goods in the event of withdrawal.
Amendment 450 #
Proposal for a regulation
Recital 50 a (new)
(50a) In the light of effective enforcement of local rules to combat long-term rental housing shortages and to limit short-term holiday rentals, as was justified in the Cali Apartments case (cases C-724/18 and C-727/18), all natural or legal persons renting out short- term holiday rentals shall be subject to the obligations under Article 22 of this Regulation.
Amendment 453 #
Proposal for a regulation
Recital 52
Amendment 460 #
Proposal for a regulation
Recital 52 a (new)
(52a) A core part of an online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, online platforms should ensure that recipients can understand how recommender systems impact the way information is displayed, and can influence how information is presented to them. They should clearly present the parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters. Options not based on profiling of the recipient should be available and used by default.
Amendment 475 #
Proposal for a regulation
Recital 57
(57) Five categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the intended use and misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products and illegally traded animals. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to technology design choices, such as the design of the algorithmic systems used by the very large online platform, or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition.
A third category of risks concerns the intended use of, malfunctioning of, as well as the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors or other vulnerable groups, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices, including undisclosed commercial communications published by recipients of the service that are not marketed, sold or arranged by the online platform. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. A fourth category concerns negative societal effects of technology design, value chain and business-model choices in relation to systemic risks that represent threats to democracy. A fifth category concerns environmental risks such as high electricity and water consumption, heat production and CO2 emissions related to the provision of the service and technical infrastructure or to user behaviour modification with a direct environmental impact, such as directing users to choose less sustainable options when it comes to delivery or packaging.
Amendment 489 #
Proposal for a regulation
Recital 60
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give vetted auditors access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. This guarantee should not be a means to circumvent the applicability of the audit obligations in this Regulation applicable to very large online platforms. Vetted auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.
Amendment 491 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Agency without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the vetted auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the vetted auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.
Amendment 493 #
Proposal for a regulation
Recital 61 a (new)
(61a) In order to ensure a participative and inclusive approach and address societal concerns raised by the services of very large online platforms, it is necessary to set up a European Social Media Council at Union level. The transparency, inclusiveness and independence of the Council ensure that decisions on content moderation are shaped by a diverse range of expertise and perspectives. The Council should support the Agency and the Commission by issuing policy and implementation recommendations and help platforms improve and adjust their content moderation practices under their terms and conditions. The Council should consist of independent experts, representatives of the recipients of the service, representatives of groups potentially impacted by their services, and civil society organisations. While not legally binding, the Council’s recommendations will yield effective outcomes, incorporating a wider and more diverse range of inputs to the societal challenges that very large online platforms may pose. Its strength and efficiency are based on voluntary compliance by platforms, whose commitment will be to respect and execute the Council’s recommendations in good faith. In order to function efficiently, the Council and its members should have sufficient human, material and financial resources at their disposal.
Amendment 494 #
Proposal for a regulation
Recital 62
Amendment 498 #
Proposal for a regulation
Recital 62 a (new)
(62a) Recommender systems used by very large online platforms pose a particular risk in terms of consumer choice and lock-in effects. Consequently, in addition to the obligations applicable to all online platforms, very large online platforms should offer to the recipients of the service the choice of using recommender systems from third party providers, where available. Such third parties must be offered access to the same operating system, hardware or software features that are available or used in the provision by the platform of its own recommender systems, including through application programming interfaces.
Amendment 502 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Agency may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers, civil society and media organisations on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Agency and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers, not-for-profit bodies, organisations or associations, or media organisations. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. To that end, the Commission should issue regulatory guidance to specify the modalities and safeguards for data access and sharing, and provide platforms with legal certainty while ensuring the independence of the research.
Amendment 517 #
Proposal for a regulation
Recital 67
(67) The Commission and the Agency should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Agency, by participating in the same codes of conduct.
Amendment 532 #
Proposal for a regulation
Recital 71 a (new)
(71a) In order to ensure that the systemic role of very large online platforms does not endanger the internal market by unfairly excluding innovative new entrants, including SMEs, entrepreneurs and start-ups, additional rules are needed to allow recipients of the service to switch or connect and interoperate between online platforms or internet ecosystems. Therefore, interoperability obligations should require very large online platforms to share appropriate tools, data, expertise, and resources. As part of those measures, the Commission should explore different technologies and open standards and protocols, including the possibility of technical interfaces (Application Programming Interface), that allow recipients of the service or other market participants to access the key functionalities of very large online platforms to exchange information.
Amendment 581 #
Proposal for a regulation
Article 13 a (new)
Amendment 603 #
Proposal for a regulation
Article 1 – paragraph 1 – point b a (new)
(ba) rules on transparency, accountability and respect for fundamental rights as regards the design and implementation of voluntary, self- and co-regulatory measures;
Amendment 609 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
Amendment 619 #
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
(ba) achieve a high level of consumer protection in the Digital Single Market.
Amendment 627 #
Proposal for a regulation
Article 1 – paragraph 5 – point b
(b) Directive (EU) 2018/1808;
Amendment 633 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
(c) Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market;
Amendment 637 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(ia) Directive (EU) 2019/882.
Amendment 648 #
Proposal for a regulation
Article 2 – paragraph 1 – point b
(b) ‘recipient of the service’ means any natural or legal person who, for professional ends or otherwise, uses the relevant intermediary service for seeking information or making it accessible;
Amendment 668 #
Proposal for a regulation
Article 2 – paragraph 1 – point e
(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is offering goods or services, including through any person acting in his or her name or on his or her behalf, for purposes directly relating to his or her trade, business, craft or profession;
Amendment 684 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘allegedly illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or the provision of services, is subject to allegations of non-compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
Amendment 691 #
Proposal for a regulation
Article 2 – paragraph 1 – point g a (new)
(ga) ‘manifestly illegal content’ means any information which has been the subject of a specific ruling by a court or administrative authority of a Member State or where it is evident to a layperson, without any substantive analysis, that the content is not in compliance with Union law or the law of a Member State;
Amendment 712 #
Proposal for a regulation
Article 2 – paragraph 1 – point i
(i) ‘dissemination to the public’ means making information accessible, at the request of the recipient of the service who provided the information, to a potentially unlimited number of third parties;
Amendment 715 #
Proposal for a regulation
Article 2 – paragraph 1 – point k
(k) ‘online interface’ means any software, including a website or a part thereof, and applications, including mobile applications, which enable recipients of the service to access and interact with the relevant intermediary service;
Amendment 721 #
Proposal for a regulation
Article 2 – paragraph 1 – point n
(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether for commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically for promoting that information;
Amendment 722 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, prioritise or rank in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
Amendment 726 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
Amendment 737 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) ‘dark pattern’ means an online interface or a part thereof that via its structure, design or functionality subverts or impairs the autonomy, decision-making, preferences or choice of recipients of the service.
Amendment 746 #
Proposal for a regulation
Article 2 a (new)
Amendment 769 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
Amendment 783 #
Proposal for a regulation
Article 6 – paragraph 1
Amendment 797 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
No provision of this Regulation shall prevent providers of intermediary services from offering end-to-end encrypted services, or make the provision of such services a cause for liability or loss of immunity.
Amendment 802 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against one or more specific items of illegal content, issued by the relevant national judicial authority, or against an offer of illegal goods or services issued by the relevant national administrative or judicial authorities, through trusted and secure communication channels, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 808 #
Proposal for a regulation
Article 8 – paragraph 1 a (new)
1a. Individuals shall have the right to report allegedly illegal content or to mandate a body, organisation or association referred to in Article 68 to report such content to the competent authorities in their country of residence, which shall expeditiously make a ruling. Where the content is deemed illegal under the national law of the country of residence of the individual, or manifestly illegal under Union law, this shall be reported to the platform for immediate enforcement on the territory of the Member State issuing the order and to the competent authorities for assessment under national law.
Amendment 816 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
— a sufficiently detailed statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed;
Amendment 820 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 2
— a clear indication of the exact electronic location of that information, such as the exact URL or URLs where appropriate, and, where necessary, additional information enabling the identification of the illegal content concerned;
Amendment 825 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
— information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content, including deadlines for appeal;
Amendment 827 #
Proposal for a regulation
Article 8 – paragraph 2 – point b
(b) the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective and does not lead to the removal of content that is legal in another Member State;
Amendment 831 #
Proposal for a regulation
Article 8 – paragraph 2 – point b a (new)
(ba) the territorial scope of an order addressed to a provider that has its main establishment in another Member State is limited to the territory of the Member State issuing the order;
Amendment 832 #
Proposal for a regulation
Article 8 – paragraph 2 – point b b (new)
(bb) the territorial scope of an order addressed to a provider or its representative that has its main establishment outside the Union, where Union law is infringed, is limited to the territory of the Union or, where national law is infringed, to the territory of the Member State issuing the order;
Amendment 853 #
Proposal for a regulation
Article 8 – paragraph 4 a (new)
4a. Providers of intermediary services may refuse to execute an order referred to in paragraph 1 if it contains manifest errors or does not contain sufficient information as referred to in paragraph 2. Providers shall inform the competent authority without undue delay, asking for the necessary clarification. They may submit an appeal to the Digital Services Coordinator of establishment where they consider that the territorial scope of the order is disproportionate.
Amendment 855 #
Proposal for a regulation
Article 8 – paragraph 4 b (new)
4b. Member States shall ensure that the judicial authorities may, at the request of an applicant whose personality rights are infringed by illegal content, issue against the relevant provider of hosting services an order in accordance with this Article to remove or disable access to this content, including by way of an interlocutory injunction.
Amendment 860 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial authorities, or regarding offers of illegal goods or services issued by administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order, without undue delay, of its receipt and the effect given to the order via trusted and secure communication channels.
Amendment 869 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 (new)
— a clear indication of the exact electronic location, an account name or a unique identifier of the recipient on whom information is sought;
Amendment 874 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2
— information about the legal redress available to the provider and to the recipients of the service concerned, including deadlines for appeal, ensuring that such redress can be exercised effectively;
Amendment 875 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2 a (new)
— whether the provider may swiftly inform the recipient of the service concerned.
Amendment 883 #
Proposal for a regulation
Article 9 – paragraph 2 a (new)
2a. The provider of the service shall inform the recipient of the service whose data is being sought without undue delay.
Amendment 889 #
Proposal for a regulation
Article 9 a (new)
Article 9a
Effective remedies for consumers
1. Recipients of the service whose content was removed according to Article 8 or whose information was sought according to Article 9 shall have the right to effective remedies against such orders, without prejudice to remedies available under Directive (EU) 2016/680 and Regulation (EU) 2016/679.
2. Such right to an effective remedy shall be exercised before a court in the issuing Member State in accordance with national law and shall include the possibility to challenge the legality of the measure, including its necessity and proportionality.
3. Digital Services Coordinators shall publish a ‘toolbox’ of complaint and redress mechanisms applicable in their respective territory, in at least one of the official languages of the Member State where they operate.
Amendment 890 #
Proposal for a regulation
Article 9 b (new)
Article 9b
Where the issuing authority is subject to a procedure under Article 7(1) or 7(2) of the Treaty on European Union, the provider of intermediary services shall act upon the order or transmit the requested data only after receiving the explicit written approval of the Digital Services Coordinator of establishment.
Amendment 895 #
Proposal for a regulation
Article 9 a (new)
Article 9a
Exclusion for micro enterprises and not-for-profit services
This Chapter shall not apply to online platforms that qualify as micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC or as a not-for-profit service with fewer than 100,000 monthly active users.
Amendment 900 #
Proposal for a regulation
Article 10 – paragraph 1 a (new)
1a. Providers of intermediary services shall ensure that recipients of the service, including affected non-users, can communicate with them in a direct, accessible and timely manner and, as necessary, request non-automated responses.
Amendment 929 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions or modifications that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, user-friendly and unambiguous language and shall be publicly available in an easily accessible and machine-readable format in the languages in which the service is offered.
Amendment 936 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1a. Providers of intermediary services shall publish summary versions of their terms and conditions in clear, user-friendly and unambiguous language, and in an easily accessible and machine-readable format. Such a summary shall include information on remedies and redress mechanisms pursuant to Articles 17 and 18, where available.
Amendment 938 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a coherent, predictable, non-discriminatory, transparent, diligent, non-arbitrary and proportionate manner in applying and enforcing the restrictions and modifications referred to in paragraph 1, in compliance with procedural safeguards and in full respect of the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter and relevant national law.
Amendment 955 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Any restriction referred to in paragraph 1 must respect the fundamental rights enshrined in the Charter and relevant national law.
Amendment 957 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2b. Individuals who are enforcing restrictions on the basis of terms and conditions of providers of intermediary services shall be given adequate initial and ongoing training on the applicable laws and international human rights standards, as well as on the action to be taken in case of conflict with the terms and conditions. Such individuals shall be provided with appropriate working conditions, including professional support, qualified psychological assistance and qualified legal advice, where relevant.
Amendment 960 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2c. Providers of intermediary services shall notify the recipients of the service of any change to the contract terms and conditions that can affect their rights and provide a user-friendly explanation thereof. The changes shall not be implemented before the expiry of a notice period which is reasonable and proportionate to the nature and extent of the envisaged changes and to their consequences for the recipients of the service. That notice period shall be at least 15 days from the date on which the provider of intermediary services notifies the recipients about the changes. Failure to consent to such changes should not lead to basic services becoming unavailable.
Amendment 978 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish in a standardised and machine-readable format, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:
Amendment 987 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, including removals, suspensions, demotions or the imposition of other sanctions, categorised by the type of reason and basis for taking those measures, as well as measures taken to provide training and assistance to members of staff who are engaged in content moderation;
Amendment 990 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
Amendment 1001 #
Proposal for a regulation
Article 13 – paragraph 2
Amendment 1014 #
Proposal for a regulation
Article 13 a (new)
Amendment 1024 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means. These mechanisms shall be close to the content in question and located on the same level in the online interface as, and clearly distinguishable from, where applicable, mechanisms for notification of alleged violations of terms and conditions. The Commission shall adopt delegated acts in accordance with Article 69 to lay down specific requirements regarding the mechanisms referred to in paragraph 1.
Amendment 1034 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator may, in some cases, identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of valid notices containing all of the following elements:
Amendment 1041 #
Proposal for a regulation
Article 14 – paragraph 2 – point a a (new)
(aa) evidence that substantiates the claim, where possible;
Amendment 1045 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the exact electronic location of that information, such as the URL or URLs or other identifiers where appropriate, and, where necessary, additional information enabling the identification of the alleged illegal content;
Amendment 1051 #
Proposal for a regulation
Article 14 – paragraph 2 – point c
Amendment 1054 #
Proposal for a regulation
Article 14 – paragraph 3
Amendment 1065 #
Proposal for a regulation
Article 14 – paragraph 4
4. The individual or entity that submitted the notice shall be given the option to provide an electronic mail address to enable the provider of hosting services to promptly send a confirmation of receipt of the notice to that individual or entity.
Amendment 1066 #
Proposal for a regulation
Article 14 – paragraph 4 a (new)
4a. Where individuals decide to include their contact details in a notice, their anonymity towards the recipient of the service who provided the content shall be ensured, except in cases of alleged violations of personality rights or of intellectual property rights.
Amendment 1068 #
Proposal for a regulation
Article 14 – paragraph 5
5. The provider shall also, without undue delay, notify that individual or entity of its action in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision.
Amendment 1069 #
Proposal for a regulation
Article 14 – paragraph 5 a (new)
5a. The provider of intermediary services shall also notify the recipient of the service who provided the information, where contact details are available, giving them the opportunity to reply before taking a decision, unless this would obstruct the prevention and prosecution of serious criminal offences.
Amendment 1077 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1 and shall act in a timely, diligent, non-discriminatory and non-arbitrary manner. Where they use automated means for pre-processing notices or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Amendment 1085 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
Article 14 – paragraph 6 a (new)
6a. Upon receipt of a valid notice, providers of hosting services shall act expeditiously to disable access to content which is manifestly illegal.
Amendment 1086 #
Proposal for a regulation
Article 14 – paragraph 6 b (new)
Article 14 – paragraph 6 b (new)
6b. Information that has been the subject of a notice and that is not manifestly illegal shall remain accessible while the assessment of its legality is still pending. Member States shall ensure that providers of intermediary services are not held liable for failure to remove notified information, while the assessment of legality is still pending.
Amendment 1090 #
Proposal for a regulation
Article 14 – paragraph 6 c (new)
Article 14 – paragraph 6 c (new)
6c. A decision taken pursuant to a notice submitted in accordance with Article 14(1) shall protect the rights and legitimate interests of all affected parties, in particular their fundamental rights as enshrined in the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue.
Amendment 1091 #
Proposal for a regulation
Article 14 – paragraph 6 d (new)
Article 14 – paragraph 6 d (new)
6d. The provider of hosting services shall ensure that processing of notices is undertaken by qualified individuals to whom adequate initial and ongoing training on the applicable legislation and international human rights standards as well as appropriate working conditions are to be provided, including, where relevant professional support, qualified psychological assistance and legal advice.
Amendment 1092 #
Proposal for a regulation
Article 15 – paragraph 1
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove, disable access to, demote or otherwise impose sanctions against specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying, removing or disabling access to that information and of the reason for its action, it shall promptly inform the recipient, at the latest at the time of the action, of the action, provide a clear and specific statement of reasons for that action, and include information on the possibility to issue a counter-notice, to make use of the internal complaint-handling system set out in Article 17 and to appeal the decision with the competent authority. This obligation shall not apply and statements of reasons may be withheld where:
(a) it is necessary for the investigation or prosecution of violations of law or public policy, including for ongoing criminal investigations, to justify avoiding or postponing notice to the recipient; or
(b) the removed content formed part of high-volume commercial campaigns to deceive users or manipulate content moderation efforts.
Amendment 1101 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
Article 15 – paragraph 2 – point a
(a) whether the action entails the removal of, demotion of or other sanction against, or the disabling of access to, the information and, where relevant, the territorial scope of the action, including, where a decision was taken pursuant to Article 14, an explanation of why the disabling of access did not exceed what was strictly necessary to achieve its objective;
Amendment 1107 #
Proposal for a regulation
Article 15 – paragraph 2 – point b
Article 15 – paragraph 2 – point b
(b) the facts and circumstances relied on in taking the action, including, where relevant, whether the action was taken pursuant to a notice of manifestly illegal content submitted in accordance with Article 14 or to an order in accordance with Article 8;
Amendment 1109 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
Article 15 – paragraph 2 – point c
(c) where applicable, information on the use made of automated means informing the decision, including where the decision was taken in respect of content detected or identified using automated means;
Amendment 1112 #
Proposal for a regulation
Article 15 – paragraph 2 – point d
Article 15 – paragraph 2 – point d
(d) where the decision concerns manifestly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground;
Amendment 1116 #
Proposal for a regulation
Article 15 – paragraph 2 – point f
Article 15 – paragraph 2 – point f
(f) clear, user-friendly information on the redress possibilities available to the recipient of the service in respect of the decision, in particular through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.
Amendment 1118 #
Proposal for a regulation
Article 15 – paragraph 4
Article 15 – paragraph 4
4. Providers of hosting services shall publish the decisions and statements of reasons referred to in paragraph 1 in a publicly accessible, machine-readable database managed and published by the Commission. That information shall not contain personal data.
Amendment 1135 #
Proposal for a regulation
Article 16
Article 16
Amendment 1153 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
Article 17 – paragraph 1 – point a
(a) decisions to remove, demote, disable access to or impose other sanctions against the information;
Amendment 1175 #
Proposal for a regulation
Article 17 – paragraph 2
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, including for persons with disabilities, user-friendly and non-discriminatory, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall set out the rules of procedure of their internal complaint-handling system in their terms and conditions in a clear, user-friendly and easily accessible manner, including for persons with disabilities.
Amendment 1180 #
Proposal for a regulation
Article 17 – paragraph 3
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent, non-discriminatory and non-arbitrary manner and within seven days starting on the date on which the online platform received the complaint. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
Amendment 1191 #
Proposal for a regulation
Article 17 – paragraph 5
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions referred to in paragraph 4 are not taken solely on the basis of automated means and are reviewed by qualified staff who are provided with adequate initial and ongoing training on the applicable legislation and international human rights standards, as well as appropriate working conditions, including, where relevant, professional support, qualified psychological assistance and legal advice.
Amendment 1199 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service and organisations mandated under Article 68 shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in Article 17. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. Online platforms shall not be liable for implementing decisions of a dispute settlement procedure. The first subparagraph is without prejudice to the right of the recipient concerned to seek redress against the decision before a court in accordance with the applicable law.
Amendment 1210 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – introductory part
Article 18 – paragraph 2 – subparagraph 1 – introductory part
2. The Digital Services Coordinator of the Member State where the independent out-of-court dispute settlement body is established shall, at the request of that body, certify the body for a maximum of three years, which may be renewed, where the body has demonstrated that it meets all of the following conditions:
Amendment 1214 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a
Article 18 – paragraph 2 – subparagraph 1 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms and its members are remunerated in a way that is not linked to the outcome of the procedure;
Amendment 1216 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a a (new)
Article 18 – paragraph 2 – subparagraph 1 – point a a (new)
(aa) it is composed of legal experts;
Amendment 1218 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point b a (new)
Article 18 – paragraph 2 – subparagraph 1 – point b a (new)
(ba) the natural persons with responsibility for dispute settlement are granted a period of office of a minimum of three years to ensure the independence of their actions;
Amendment 1219 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point b b (new)
Article 18 – paragraph 2 – subparagraph 1 – point b b (new)
(bb) the natural persons with responsibility for dispute settlement commit not to work for the online platform or a professional organisation or business association of which the online platform is a member for a period of three years after their position in the body has ended;
Amendment 1220 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point b c (new)
Article 18 – paragraph 2 – subparagraph 1 – point b c (new)
(bc) natural persons with responsibility for dispute resolution may not have worked for an online platform or a professional organisation or business association of which the online platform is a member for a period of two years before taking up their position in the body;
Amendment 1227 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
Article 18 – paragraph 2 – subparagraph 1 – point c
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
Amendment 1228 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c a (new)
Article 18 – paragraph 2 – subparagraph 1 – point c a (new)
(ca) the anonymity of the individuals involved in the settlement procedure can be guaranteed;
Amendment 1231 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point d
Article 18 – paragraph 2 – subparagraph 1 – point d
(d) it ensures the settling of disputes in a swift, efficient and cost-effective manner and in at least one official language of the Union or, at the request of the recipient, at least in English;
Amendment 1237 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e
Article 18 – paragraph 2 – subparagraph 1 – point e
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure which are easily and publicly accessible.
Amendment 1239 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e a (new)
Article 18 – paragraph 2 – subparagraph 1 – point e a (new)
(ea) it ensures that a preliminary decision is taken within seven days of receipt of the complaint and that the outcome of the dispute settlement is made available within 90 calendar days from the date on which the body received the complete complaint file.
Amendment 1247 #
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 1
Article 18 – paragraph 3 – subparagraph 1
If the body decides the dispute in favour of the recipient of the service or an organisation mandated under Article 68, the online platform shall reimburse the recipient or organisation for any fees and other reasonable expenses that the recipient or organisation has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, the recipient or organisation shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement.
Amendment 1252 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
Article 18 – paragraph 6 a (new)
6a. By 31 December 2024, and every two years thereafter, Digital Services Coordinators shall assess whether the dispute settlement bodies that they have certified in accordance with paragraph 2 comply with the requirements of this Regulation. Each Digital Services Coordinator shall publish and send to the Agency a report on the development and functioning of those bodies. That report shall in particular:
(a) identify best practices of the out-of-court dispute settlement bodies;
(b) report on any demonstrable shortcomings, supported by statistics, that hinder the functioning of the out-of-court dispute settlement bodies for both domestic and cross-border disputes, where appropriate;
(c) make recommendations on how to improve the effective and efficient functioning of the out-of-court dispute settlement bodies, where appropriate.
Amendment 1260 #
Proposal for a regulation
Article 19 – paragraph 1
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 14, are processed and decided upon with priority and expeditiously, taking into account due process. The use of automated notices by trusted flaggers without effective human review shall not be accepted as a valid means of submission.
Amendment 1267 #
Proposal for a regulation
Article 19 – paragraph 2 – introductory part
Article 19 – paragraph 2 – introductory part
2. The status of trusted flagger under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated that it meets all of the following conditions:
Amendment 1271 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying allegedly illegal content;
Amendment 1275 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform, law enforcement or governmental entity;
Amendment 1280 #
Proposal for a regulation
Article 19 – paragraph 2 – point c
Article 19 – paragraph 2 – point c
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent, accurate and objective manner.
Amendment 1282 #
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
Article 19 – paragraph 2 – point c a (new)
(ca) it publishes, at least once a year, clear, easily comprehensible and detailed reports on all notices submitted in accordance with Article 14 during the relevant period. The report shall list:
- the notices categorised by the identity of the provider of hosting services;
- the type of content notified;
- the specific legal provisions allegedly breached by the content notified;
- the action taken by the provider;
- any potential conflicts of interest and sources of funding, and an explanation of the procedures in place to ensure that the trusted flagger maintains its independence.
Amendment 1295 #
Proposal for a regulation
Article 19 – paragraph 3
Article 19 – paragraph 3
3. Digital Services Coordinators shall award the trusted flagger status for periods of three years, upon which the status may be renewed where the trusted flagger concerned continues to meet the requirements of this Regulation, and shall communicate to the Agency the names, addresses and electronic mail addresses of the entities to which they have awarded the status of trusted flagger in accordance with paragraph 2.
Amendment 1299 #
Proposal for a regulation
Article 19 – paragraph 4
Article 19 – paragraph 4
4. The Commission shall publish the information referred to in paragraphs 3 and 6 in a publicly available database, in an easily accessible and machine-readable format, and keep the database updated.
Amendment 1304 #
Proposal for a regulation
Article 19 – paragraph 5
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a not insignificant number of insufficiently precise, inaccurate or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
Amendment 1310 #
Proposal for a regulation
Article 19 – paragraph 6
Article 19 – paragraph 6
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
Amendment 1315 #
Proposal for a regulation
Article 19 – paragraph 7
Article 19 – paragraph 7
7. The Agency may issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 5 and 6.
Amendment 1318 #
Proposal for a regulation
Article 20 – paragraph 1
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content. Any prior warning shall provide the recipient of the service with a reasonable amount of time to provide a justification to the online platform to consider that the information to which the suspension relates is not manifestly illegal. Such justifications shall be subject to human review.
Amendment 1328 #
Proposal for a regulation
Article 20 – paragraph 2
Article 20 – paragraph 2
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms, internal complaint-handling systems and out-of-court dispute settlement bodies referred to in Articles 14, 17 and 18, respectively, by individuals or entities or by complainants that repeatedly submit notices or complaints or initiate dispute settlements that are manifestly unfounded.
Amendment 1337 #
Proposal for a regulation
Article 20 – paragraph 3 – point c
Article 20 – paragraph 3 – point c
(c) the gravity of the misuse and its consequences, in particular on the exercise of fundamental rights, regardless of the absolute numbers or relative proportion;
Amendment 1342 #
Proposal for a regulation
Article 20 – paragraph 3 – point d a (new)
Article 20 – paragraph 3 – point d a (new)
(da) the fact that notices and complaints were submitted following the use of an automated content recognition system;
Amendment 1343 #
Proposal for a regulation
Article 20 – paragraph 3 – point d b (new)
Article 20 – paragraph 3 – point d b (new)
(db) any justification provided by the recipient of the service giving sufficient grounds to consider that the information is not manifestly illegal.
Amendment 1347 #
Proposal for a regulation
Article 20 – paragraph 4
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner and with due regard to their obligations under Article 12(2), in particular as regards the applicable fundamental rights of the recipients of the service as enshrined in the Charter, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse, and the duration of the suspension.
Amendment 1370 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained, and has made best efforts to verify the completeness and reliability of, the following information:
Amendment 1410 #
Proposal for a regulation
Article 22 – paragraph 2
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union, or through requests to the trader to provide supporting documents from reliable sources.
Amendment 1421 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 2
Article 22 – paragraph 3 – subparagraph 2
Where the trader fails to correct or complete that information swiftly, the online platform shall suspend the provision of its service to the trader until the request is complied with.
Amendment 1445 #
Proposal for a regulation
Article 22 – paragraph 6
Article 22 – paragraph 6
6. The online platform shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 publicly available in a clear, easily accessible and comprehensible manner.
Amendment 1453 #
Proposal for a regulation
Article 22 – paragraph 7 a (new)
Article 22 – paragraph 7 a (new)
7a. Online platforms facilitating short-term holiday rentals shall obtain registration numbers, licensing numbers or an equivalent where such a number is required for the offering of short-term holiday rentals by EU, national or local law, and shall publish this number in the offer.
Amendment 1464 #
Proposal for a regulation
Article 22 a (new)
Article 22 a (new)
Article 22a Transparency for sustainable consumption Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that it provides consumers in a clear and unambiguous manner and in real time with information on the environmental impact of its products and services, such as the use of sustainable and efficient delivery methods, sustainable and ecological packaging, as well as the environmental costs of returning goods in the event of withdrawal.
Amendment 1467 #
Proposal for a regulation
Article 23 – paragraph 1 – point a a (new)
Article 23 – paragraph 1 – point a a (new)
(aa) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed;
Amendment 1468 #
Proposal for a regulation
Article 23 – paragraph 1 – point a b (new)
Article 23 – paragraph 1 – point a b (new)
(ab) a list of all trusted flaggers and their area of expertise;
Amendment 1471 #
Proposal for a regulation
Article 23 – paragraph 1 – point c
Article 23 – paragraph 1 – point c
(c) any use made of automatic means for the purpose of content moderation, including a specification of the precise purposes, indicators of the accuracy of the automated means in fulfilling those purposes and any safeguards applied, including human review.
Amendment 1484 #
Proposal for a regulation
Article 24 – paragraph 1
Article 24 – paragraph 1
Amendment 1518 #
Proposal for a regulation
Article 24 a (new)
Article 24 a (new)
Amendment 1521 #
Proposal for a regulation
Article 24 b (new)
Article 24 b (new)
Article 24b Additional obligations for platforms primarily used for the dissemination of user-generated pornographic content Where an online platform is primarily used for the dissemination of user-generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure: (a) that users who disseminate content have verified themselves through a double opt-in e-mail and cell phone registration; (b) professional human content moderation, in line with Article 14 paragraph 6d (new), by moderators trained to identify image-based sexual abuse, where content has a high probability of being illegal; (c) the accessibility of a qualified notification procedure whereby, in addition to the mechanism referred to in Article 14 and respecting the same principles with the exception of paragraph 5a (new), individuals may notify the platform with the claim that image material depicting them or purporting to depict them is being disseminated without their consent, and supply the platform with prima facie evidence of their physical identity; content notified through this procedure shall be considered manifestly illegal in terms of Article 14 paragraph 6a (new) and shall be suspended without undue delay and at the latest within 48 hours.
Amendment 1543 #
Proposal for a regulation
Article 25 a (new)
Article 25 a (new)
Article 25a Legal representatives of very large online platforms Very large online platforms shall establish one point of contact in each Member State and ensure that it is accessible for recipients of the service in at least one of the official languages of that Member State.
Amendment 1549 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), and at least once a year thereafter, any significant systemic risks stemming from the design, functioning and use made of their services in the Union. This risk assessment shall be specific to their services and activities, including technology design, value chain and business-model choices, and shall include the following systemic risks:
Amendment 1572 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
Article 26 – paragraph 1 – point b
(b) any foreseeable impact on the exercise of fundamental rights, in particular the rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in the Charter;
Amendment 1574 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
Article 26 – paragraph 1 – point c
(c) the intended use, any malfunctioning or intentional manipulation of their service, including by means of commercial communications published on the platform that are not marketed, sold or arranged by the platform, or automated exploitation of the service, in particular with an actual or foreseeable negative impact on the protection of public health, minors and other categories of vulnerable groups of recipients of the service, civic discourse, or actual or foreseeable impacts related to electoral processes and public security;
Amendment 1581 #
Proposal for a regulation
Article 26 – paragraph 1 – point c a (new)
Article 26 – paragraph 1 – point c a (new)
(ca) any foreseeable negative societal effect of technology design or business- model choices in relation to systemic risks that represent threats to democracy;
Amendment 1582 #
Proposal for a regulation
Article 26 – paragraph 1 – point c b (new)
Article 26 – paragraph 1 – point c b (new)
(cb) any environmental impact, such as electricity and water consumption, heat production and CO2 emissions, related to the provision of the service and technical infrastructure or to consumer behaviour modification with a direct environmental impact.
Amendment 1585 #
Proposal for a regulation
Article 26 – paragraph 2
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting, targeting and displaying advertisements, as well as the underlying data collection, processing and profiling, influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of content that is incompatible with their terms and conditions.
Amendment 1595 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
Article 26 – paragraph 2 a (new)
2a. The outcome of the risk assessment and supporting documents shall be communicated to the Agency and the Digital Services Coordinator of establishment. A summary version of the risk assessment shall be made publicly available in an easily accessible format.
Amendment 1597 #
Proposal for a regulation
Article 26 – paragraph 2 b (new)
Article 26 – paragraph 2 b (new)
2b. Organisations mandated under Article 68 shall have the right to obtain access to the outcome and supporting documents of a risk assessment and to lodge a complaint against its accuracy or completeness with the Digital Services Coordinator of establishment.
Amendment 1605 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place transparent, proportionate and effective measures to eliminate, prevent and mitigate the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
Amendment 1608 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision-making processes, the design, features or functioning of their services, their advertising model or their terms and conditions;
Amendment 1615 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
Article 27 – paragraph 1 – point b
(b) targeted measures aimed at limiting the display and targeting of advertisements in association with the service they provide;
Amendment 1616 #
Proposal for a regulation
Article 27 – paragraph 1 – point c
(c) reinforcing the internal processes, testing, documentation or supervision of any of their activities, in particular as regards detection of systemic risk;
Amendment 1622 #
Proposal for a regulation
Article 27 – paragraph 1 – point e a (new)
(ea) targeted measures aimed at reducing electricity and water consumption, heat production and CO2 emissions related to the provision of the service and technical infrastructure.
Amendment 1624 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Any measure adopted shall respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and shall be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 1632 #
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
2. The Agency shall publish comprehensive reports, once a year, which shall include the following:
Amendment 1634 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 30, 31 and 33;
Amendment 1643 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Agency may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Agency shall organise public consultations.
Amendment 1651 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, and additionally where requested by the Agency, to independent audits to assess compliance with the following:
Amendment 1656 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III. Audits shall at least be performed on:
(i) the clarity, coherence and predictable enforcement of terms of service, with particular regard to the applicable fundamental rights as enshrined in the Charter;
(ii) the completeness, methodology and consistency of the transparency reporting obligations as set out in Articles 13, 13a, 23 and 30, as well as respect for industry standards on transparency reporting;
(iii) the accuracy, predictability and clarity of the provider's follow-up for recipients of the service and notice providers to notices of manifestly illegal content and terms of service violations, and the accuracy of classification (illegal or terms and conditions violation) of removed information;
(iv) internal and third-party complaint handling mechanisms;
(v) interaction with trusted flaggers and independent assessment of accuracy, response times, efficiency and whether there are indications of abuse;
(vi) diligence with regard to verification of the traceability of traders;
(vii) the adequateness and correctness of the risk assessment as set out in Article 26;
(viii) the adequateness and effectiveness of the measures taken according to Article 27 to address the risks identified in the risk assessments as set out in Article 26;
(ix) the effectiveness of and compliance with codes of conduct.
Audits on the subjects mentioned in points (i) to (vii) may be combined where the organisation performing the audits has subject-specific expertise on the subject matters at hand.
Amendment 1666 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a) are legally and financially independent from the very large online platform concerned;
Amendment 1667 #
Proposal for a regulation
Article 28 – paragraph 2 – point b
Amendment 1669 #
Proposal for a regulation
Article 28 – paragraph 2 – point c
(c) have been recognised and vetted by the Agency on the basis of their proven objectivity, subject-specific expertise and professional ethics, based in particular on adherence to codes of practice or appropriate standards.
Amendment 1671 #
Proposal for a regulation
Article 28 – paragraph 2 – point c a (new)
(ca) natural persons performing the audits commit not to work for the very large online platform audited or a professional organisation or business association of which the platform is a member for a period of three years after their position in the auditing organisation has ended.
Amendment 1673 #
Proposal for a regulation
Article 28 – paragraph 3 – introductory part
3. The organisations that perform the audits shall establish an audit report for each audit subject as referred to in point (a) of paragraph 1. The report shall be in writing and include at least the following:
Amendment 1674 #
Proposal for a regulation
Article 28 – paragraph 3 – point b a (new)
(ba) a declaration of interests;
Amendment 1675 #
Proposal for a regulation
Article 28 – paragraph 3 – point d
(d) a description of the main findings drawn from the audit and a summary of the main findings;
Amendment 1682 #
Proposal for a regulation
Article 28 – paragraph 4
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
Amendment 1686 #
Proposal for a regulation
Article 28 a (new)
Amendment 1688 #
Proposal for a regulation
Article 29 – title
Recommender systems of very large online platforms
Amendment 1689 #
Proposal for a regulation
Article 29 – paragraph 1
Amendment 1703 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2a. In addition to the obligations applicable to all online platforms, very large online platforms shall offer to the recipients of the service the choice of using recommender systems from third party providers, where available. Such third parties must be offered access to the same operating system, hardware or software features that are available or used in the provision by the platform of its own recommender systems.
Amendment 1705 #
Proposal for a regulation
Article 29 – paragraph 2 b (new)
2b. Very large online platforms may only limit access to third-party recommender systems temporarily and in exceptional circumstances, when justified by an obligation under Article 18 of Directive (EU) 2020/0359 and Article 32(1)(c) of Regulation (EU) 2016/679. Such limitations shall be notified within 24 hours to affected third parties and to the Agency. The Agency may require such limitations to be removed or modified where it decides by majority vote they are unnecessary or disproportionate.
Amendment 1706 #
Proposal for a regulation
Article 29 – paragraph 2 c (new)
2c. Very large online platforms shall not make commercial use of any of the data that is generated or received from third parties as a result of interoperability activities for purposes other than enabling those activities. Any processing of personal data related to those activities shall comply with Regulation (EU) 2016/679, in particular Articles 6(1)(a) and 5(1)(c).
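Paragraphs 2a to 2c above in effect require an interoperability interface through which third-party recommender systems can rank content with the same access as the platform's own system. The following is a minimal illustrative sketch of such a contract; the `RecommenderProvider` interface and a chronological third-party provider are assumptions for illustration, as the amendment prescribes no concrete API.

```python
from abc import ABC, abstractmethod
from typing import Dict, List


class RecommenderProvider(ABC):
    """Hypothetical contract a very large online platform might expose so that
    third-party recommender systems can rank content on equal footing with the
    platform's own system (illustrative only)."""

    @abstractmethod
    def rank(self, candidate_ids: List[str], context: Dict[str, str]) -> List[str]:
        """Return the candidate item ids in display order."""


class ChronologicalProvider(RecommenderProvider):
    """A minimal third-party provider: newest items first, no profiling and no
    commercial reuse of interoperability data (cf. paragraph 2c)."""

    def rank(self, candidate_ids, context):
        # Assumes ids sort lexicographically by creation time (e.g. ULIDs).
        return sorted(candidate_ids, reverse=True)


provider = ChronologicalProvider()
print(provider.rank(["01A", "03C", "02B"], context={}))  # newest-first: ['03C', '02B', '01A']
```

A platform-side registry of such providers would then let the recipient choose among them, as paragraph 2a envisages.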
Amendment 1717 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available, in an easily accessible and comprehensible format and through application programming interfaces, a repository containing the information referred to in paragraph 2, until seven years after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
Amendment 1727 #
Proposal for a regulation
Article 30 – paragraph 2 – point d
(d) whether the advertisement was intended to be displayed specifically to one or more particular groups of recipients of the service and if so, all parameters used for that purpose, including any parameters used to exclude particular groups;
Amendment 1729 #
Proposal for a regulation
Article 30 – paragraph 2 – point d a (new)
(da) where it is disclosed, a copy of the content of commercial communications published on the very large online platforms that are not marketed, sold or arranged by the very large online platform, which have through appropriate channels been declared as such to the very large online platform;
Amendment 1731 #
Proposal for a regulation
Article 30 – paragraph 2 – point e
(e) the total number of recipients of the service reached in terms of impressions and engagements of the advertisement and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically.
Amendment 1734 #
Proposal for a regulation
Article 30 – paragraph 2 – point e a (new)
(ea) in case of an advertisement removed on the basis of a notice submitted in accordance with Article 14 or an order as set out in Article 8, the information referred to in points (b) to (d) of paragraph 2;
Amendment 1741 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. The online platform shall make reasonable efforts to ensure that the information referred to in paragraph 2 is accurate and complete.
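The Article 30 amendments above describe the contents of the ad repository and a seven-year retention period, but not a schema. The following sketch shows one possible shape of a repository record covering points (a) to (e), together with the retention rule of paragraph 1 as amended; all field names are assumptions for illustration.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta


@dataclass
class AdRecord:
    """Illustrative record in the Article 30 ad repository (hypothetical schema)."""
    ad_content: str                  # point (a): content of the advertisement
    advertiser: str                  # point (b): on whose behalf it is displayed
    last_displayed: date             # point (c): end of the display period
    targeting_parameters: dict = field(default_factory=dict)  # point (d): all parameters, incl. exclusions
    impressions: int = 0             # point (e): reach in impressions
    engagements: int = 0             # point (e): reach in engagements


def must_retain(record: AdRecord, today: date) -> bool:
    """Paragraph 1 as amended: keep the record until seven years after the
    advertisement was last displayed (seven years approximated as 7 * 365 days)."""
    return today <= record.last_displayed + timedelta(days=7 * 365)


rec = AdRecord("Buy widgets", "Widgets Ltd", date(2020, 1, 1),
               {"included": ["keyword:widgets"], "excluded": []}, 1000, 50)
print(must_retain(rec, date(2024, 1, 1)))  # within seven years -> True
print(must_retain(rec, date(2030, 1, 1)))  # past seven years -> False
```

Note that, per paragraph 1, no personal data of recipients appears in the record: targeting is described only by its parameters.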
Amendment 1753 #
Proposal for a regulation
Article 31 – paragraph 1
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or an independent enforcement and monitoring unit of the Agency, upon reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes.
Amendment 1754 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from at least three Digital Services Coordinators of destination, the Digital Services Coordinator of establishment or the Agency, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers, vetted not-for-profit bodies, organisations or associations or vetted media organisations who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification, mitigation and understanding of systemic risks as set out in Article 26(1) and Article 27(1).
Amendment 1762 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be independent from commercial interests, not receive any funding from any of the very large online platforms as defined in Article 25 and disclose all funding sources, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. In order to be vetted, not-for-profit bodies, organisations or associations have to meet the requirements laid down in Article 68, have statutory objectives which are in the public interest, and have expertise related to the fields referred to in Article 26.
Amendment 1769 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Agency, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers, not-for-profit bodies, organisations or associations or media organisations can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 1777 #
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested because it does not have access to the data.
Amendment 1779 #
Proposal for a regulation
Article 31 – paragraph 6 – point a
Amendment 1780 #
Proposal for a regulation
Article 31 – paragraph 6 – point b
Amendment 1786 #
Proposal for a regulation
Article 31 – paragraph 7 a (new)
Amendment 1790 #
Proposal for a regulation
Article 31 – paragraph 7 b (new)
7b. The Commission shall issue regulatory guidance for very large online platforms and consult with the European Data Protection Board to facilitate the drafting and implementation of codes of conduct at Union level between very large online platforms and vetted researchers, not-for-profit bodies, organisations or associations or media organisations on the appropriate technical and organisational safeguards to be implemented before data can be shared pursuant to paragraphs 1 and 2.
Amendment 1791 #
Proposal for a regulation
Article 31 – paragraph 7 c (new)
7c. Upon completion of the research envisaged in Article 31(2), the vetted researchers, not-for-profit bodies, organisations or associations or media organisations, shall make their research publicly available, while fully respecting the rights and interests of the recipients of the service concerned in compliance with Regulation (EU) 2016/679.
Amendment 1797 #
Proposal for a regulation
Article 33 – paragraph 1
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months in a standardised, machine-readable and easily accessible format.
Amendment 1801 #
Proposal for a regulation
Article 33 – paragraph 2 – point d a (new)
(da) aggregate numbers for the total views and view rate of content prior to a removal on the basis of orders issued in accordance with Article 8 or content moderation engaged in at the provider’s own initiative and under its terms and conditions.
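Article 33(1) as amended requires the transparency reports to be published in a standardised, machine-readable and easily accessible format. A minimal sketch of what such a report could look like as JSON; the key names and figures are hypothetical, since the amendment prescribes the format's properties rather than a concrete schema.

```python
import json

# Hypothetical six-monthly transparency report under Articles 13/33 as amended.
report = {
    "platform": "example-vlop",
    "reporting_period": {"from": "2021-01-01", "to": "2021-06-30"},
    "orders_article_8": {"received": 120, "median_action_time_hours": 6},
    "notices_article_14": {"received": 45000, "removals": 12000},
    "own_initiative_moderation": {"removals": 30000},
    # point (da) as proposed: aggregate views of content prior to its removal
    "pre_removal_views": {"total_views": 9800000, "view_rate_per_item": 230.5},
}

# Machine-readable: the report round-trips losslessly through serialisation.
serialised = json.dumps(report, sort_keys=True)
print(json.loads(serialised)["notices_article_14"]["removals"])  # 12000
```

A fixed, published schema of this kind is what would let researchers and Digital Services Coordinators compare reports across platforms automatically.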
Amendment 1806 #
Proposal for a regulation
Article 33 a (new)
Amendment 1813 #
Proposal for a regulation
Article 34 – paragraph 1 – introductory part
1. Where necessary to achieve agreed and clearly defined public objectives, the Commission shall support and promote the development and implementation of voluntary industry standards set by relevant European and international standardisation bodies at least for the following:
Amendment 1816 #
Proposal for a regulation
Article 34 – paragraph 1 – point a
(a) electronic submission of notices under Article 14 in a manner that permits the logging and, where possible, the automatic publication of all relevant statistical data;
Amendment 1819 #
Proposal for a regulation
Article 34 – paragraph 1 – point b
(b) electronic submission of notices by trusted flaggers under Article 19, including, if necessary, through application programming interfaces, and which permit the logging and, where possible, the automatic publication of all relevant statistical data;
Amendment 1820 #
Proposal for a regulation
Article 34 – paragraph 1 – point b a (new)
(ba) terms and criteria for the submission of notices in a diligent manner by trusted flaggers under Article 19;
Amendment 1821 #
Proposal for a regulation
Article 34 – paragraph 1 – point c
(c) specific interfaces, including application programming interfaces or other mechanisms, to facilitate compliance with the obligations set out in Articles 30 and 31;
Amendment 1830 #
Proposal for a regulation
Article 34 – paragraph 1 – point f a (new)
(fa) transparency reporting obligations pursuant to Article 13;
Amendment 1832 #
Proposal for a regulation
Article 34 – paragraph 1 – point f b (new)
(fb) the design of online interfaces regarding inter alia the acceptance of and changes to terms and conditions, settings, advertising practices, recommender systems, and decisions within the content moderation process to prevent dark patterns;
Amendment 1833 #
Proposal for a regulation
Article 34 – paragraph 1 – point f c (new)
(fc) electricity, water and heat consumption, including such consumption caused by artificial intelligence and recommender systems used by very large online platforms;
Amendment 1834 #
Proposal for a regulation
Article 34 – paragraph 1 – point f d (new)
(fd) data sufficiency, aiming at the reduction of data generation, in particular traffic data, including the reduction of associated electricity, water and heat consumption and resources from data centres.
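Points (a) and (b) of paragraph 1 call for standards for electronic notice submission that permit the logging and, where possible, the automatic publication of all relevant statistical data. A minimal sketch of that logging idea; the `NoticeLog` class and its field names are assumptions for illustration, not part of any standard named in the text.

```python
from collections import Counter


class NoticeLog:
    """Illustrative log of Article 14 notices supporting automatic publication
    of aggregate statistics, as points (a) and (b) of paragraph 1 envisage."""

    def __init__(self):
        self._by_category = Counter()
        self._total = 0

    def submit(self, notice: dict) -> int:
        """Record a notice and return its sequence number."""
        self._by_category[notice.get("category", "unspecified")] += 1
        self._total += 1
        return self._total

    def public_statistics(self) -> dict:
        """Aggregate, automatically publishable statistics (no personal data)."""
        return {"total_notices": self._total, "by_category": dict(self._by_category)}


log = NoticeLog()
log.submit({"category": "illegal_content", "url": "https://example.org/item/1"})
log.submit({"category": "illegal_content", "url": "https://example.org/item/2"})
log.submit({"category": "terms_violation", "url": "https://example.org/item/3"})
print(log.public_statistics())  # totals per category, ready for publication
```

Because only aggregates leave the log, such a mechanism is compatible with the requirement that published statistics contain no personal data, and it does not itself impose any restriction on the notified content (cf. paragraph 2a below on impact assessments).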
Amendment 1841 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
2a. At least with regard to points (a), (b) and (ba new) of paragraph 1, the Commission shall carry out thorough impact assessments before implementation in order to ensure compliance with Union law. In particular, such mechanisms shall not lead to restrictions being automatically imposed on notified content.
Amendment 1845 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Agency shall encourage and facilitate the drafting and implementation of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges and responsibilities involved in comprehensively tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. Particular attention shall be given to avoiding counterproductive effects on competition, data access and security, the general monitoring prohibition and the rights of individuals. The Commission and the Agency shall approve and be party to any such code of conduct, in order to ensure adequate accountability and legal redress for individuals.
Amendment 1859 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 1862 #
Proposal for a regulation
Article 35 – paragraph 3
Amendment 1869 #
Proposal for a regulation
Article 35 – paragraph 4
4. The Commission and the Agency shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate, at least once a year, the achievement of their objectives and include at least the following points:
(a) the evolution of the scale and nature of the public policy problem being addressed by the relevant code;
(b) the existence or emergence of commercial interests on the part of the online platform that may disincentivise the successful implementation of the code;
(c) whether there are adequate safeguards to ensure the rights of individuals and businesses.
They shall publish their conclusions.
Amendment 1875 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Agency shall regularly monitor and evaluate, at least once a year, the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain.
Amendment 1877 #
Proposal for a regulation
Article 35 – paragraph 5 a (new)
5a. For each Code of Conduct a European Citizens’ Assembly is established that monitors the outcomes of the Codes of Conduct, discusses the main issues at stake publicly and sets out public policy recommendations to the Commission. The members of the European Citizens’ Assemblies shall be randomly selected so as to be broadly representative of European society, taking into account gender, age, location, and social class.
Amendment 1885 #
Proposal for a regulation
Article 36 – paragraph 2 – point a
(a) the transmission of information held by providers of online advertising intermediaries to recipients of the service with regard to requirements set in points (b) and (c) of Article 24a new;
Amendment 1886 #
Proposal for a regulation
Article 36 – paragraph 2 – point b
(b) the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 30, in particular the information referred to in points (d) and (da new) of paragraph 2 of Article 30.