210 Amendments of Christel SCHALDEMOSE related to 2020/0361(COD)
Amendment 96 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information, the freedom to conduct a business, privacy and personal data protection, the right to non-discrimination and access to justice.
Amendment 112 #
Proposal for a regulation
Recital 10
(10) For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council31, Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33 as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council34, Directive 2011/83/EU of the European Parliament and of the Council35 and Council Directive 93/13/EEC36, as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37, Directive 2013/11/EU of the European Parliament and of the Council, Directive 2006/123/EC of the European Parliament and of the Council, and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council38. The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union law on working conditions. _________________ 30Regulation (EU) 2019/1148 of the European Parliament and of the Council on the marketing and use of explosives precursors, amending Regulation (EC) No 1907/2006 and repealing Regulation (EU) No 98/2013 (OJ L 186, 11.7.2019, p. 1). 31 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57). 
32Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), OJ L 201, 31.7.2002, p. 37. 33Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC. 34 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 35Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council. 36Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts. 37Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules 38Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
Amendment 124 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and cover illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that is not in compliance with Union law since it refers to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 126 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation should apply in respect of issues that are not or not fully addressed by those other acts, and should be without prejudice to the Member States’ possibility to adopt and further develop laws, regulations and other measures, which serve a legitimate public interest, in particular to protect the freedom of information and media or to foster the diversity of media or opinion and cultural or linguistic diversity. In particular, in the event of a conflict between Directive 2010/13/EU and the present Regulation, the provisions of Directive 2010/13/EU should prevail. Similarly, legislation that is in accordance with Directive 2010/13/EU at national level, aiming at securing and fostering the fulfilment of various objectives underpinning the audiovisual policy of the Union and its Member States, should also prevail. 
_________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1 . 29Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
Amendment 129 #
Proposal for a regulation
Recital 9 a (new)
(9 a) The right of the Member States to provide additional obligations, exemptions or derogations, which serve a legitimate public interest, in particular to protect the freedom of information and media or to foster the diversity of media or opinion and cultural or linguistic diversity, should remain unaffected. Because of the convergence of media, legislation and other measures that ensure and promote media pluralism may be necessary for the entire online environment. The right of the Member States especially includes substantive rules, rules of procedure and enforcement rules, including the regulatory structure.
Amendment 131 #
Proposal for a regulation
Recital 9 b (new)
(9 b) Respecting the Union’s subsidiary competence to take cultural aspects into account in its action according to Article 167, Paragraph 4 of the Treaty, this Regulation should not affect Member States’ competences in their respective cultural policies, nor should it prejudice national measures addressed to intermediary service providers in order to protect the freedom of expression and information, media freedom and to foster media pluralism as well as cultural and linguistic diversity.
Amendment 149 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the Charter of Fundamental Rights of the European Union, including the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 152 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical, automatic and passive processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of the intermediary service itself, including where the information has been developed under the editorial responsibility of that provider or where the intermediary service provider optimises or promotes content considered as legal, regardless of whether this process is automated.
Amendment 157 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 158 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner and accompanied by additional safeguards. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 168 #
Proposal for a regulation
Recital 28 a (new)
(28 a) Since media service providers and publishers of press publications within the meaning of Article 2(4) of Directive (EU) 2019/790 hold editorial responsibility for the content and services they make available, such content and services should benefit from a specific regime that prevents multiple control of those content and services. Those content and services are typically offered in accordance with professional and journalistic standards as well as legislation and are already subject to systems of supervision and control, often enshrined in commonly accepted self-regulatory standards and codes. In addition, they usually have in place complaints handling mechanisms to resolve content-related disputes. Editorial responsibility means the exercise of effective control both over the selection of content and over its provision by means of its presentation, composition and organisation. Editorial responsibility does not necessarily imply any legal liability under national law for the content or the services provided. Intermediary service providers should refrain from removing, suspending or disabling access to any such content or services, and should be exempt from liability for such content and services. A presumption of legality should exist in relation to the content and services provided by media service providers and publishers of press publications who carry out their activities in respect of European values and fundamental rights. Compliance by media service providers with these rules and regulations should be overseen by the respective independent regulatory authorities, bodies or both and the respective European networks in which they are organised.
Amendment 168 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content or as an obligation to use automated content-filtering tools.
Amendment 188 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights and freedoms guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, a high level of consumer protection and the right to non-discrimination.
Amendment 188 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant and up-to-date information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
Amendment 190 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. In particular, it is important to ensure that terms and conditions are fair, non-discriminatory and transparent, and are drafted in clear and unambiguous language in line with applicable Union law. The terms and conditions should include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making, human review, the legal consequences to be faced by the users for knowingly storing or uploading illegal content as well as on the right to terminate the use of the service. Providers of intermediary services should also provide recipients of services with a concise and easily readable summary of the main elements of the terms and conditions, including the remedies available.
Amendment 195 #
Proposal for a regulation
Recital 4 a (new)
(4a) The United Nations Convention on the Rights of Persons with Disabilities (UN CRPD) requires its Parties to take appropriate measures to ensure that persons with disabilities have access, on an equal basis with others, to information and communications technologies and systems, and other facilities and services open or provided to the public, both in urban and in rural areas. The UN CRPD further states that the strict application of universal design to all new goods, products, facilities, technologies and services should ensure full, equal and unrestricted access for all potential consumers, including persons with disabilities, in a way that takes full account of their inherent dignity and diversity. Given the ever-growing importance of digital services in private and public life, in line with the obligations enshrined in the UN CRPD, the Union must ensure a regulatory framework for digital services which protects rights of all recipients of services, including persons with disabilities.
Amendment 201 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible, comprehensive and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
Amendment 202 #
Proposal for a regulation
Recital 6 a (new)
(6a) The notions of ‘access’ or ‘accessibility’ are often referred to with the meaning of affordability (financial access), availability, or in relation to access to data, use of network, etc. It is important to distinguish these from ‘accessibility for persons with disabilities’ which means that services, technologies and products are perceivable, operable, understandable and robust for persons with disabilities.
Amendment 212 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation should apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level, and should be without prejudice to the Member States’ possibility to adopt and further develop laws, regulations and other measures, which serve a legitimate public interest, in particular to protect the freedom of information and media or to foster the diversity of media or opinion and cultural or linguistic diversity. In particular, in the event of a conflict between Directive 2010/13/EU and the present Regulation, the provisions of Directive 2010/13/EU should prevail. Similarly, legislation that is in accordance with Directive 2010/13/EU at national level, aiming at securing and fostering the fulfilment of various objectives underpinning the audiovisual policy of the Union and its Member States, should also prevail. 
__________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1 . 29Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
Amendment 217 #
Proposal for a regulation
Recital 9 a (new)
(9a) The right of the Member States to provide additional obligations, exemptions or derogations, which serve a legitimate public interest, in particular to protect the freedom of information and media or to foster the diversity of media or opinion and cultural or linguistic diversity, should remain unaffected. Because of the convergence of media, legislation and other measures that ensure and promote media pluralism may be necessary for the entire online environment. The right of the Member States especially includes substantive rules, rules of procedure and enforcement rules, including the regulatory structure.
Amendment 218 #
Proposal for a regulation
Recital 9 b (new)
(9b) Respecting the Union’s subsidiary competence to take cultural aspects into account in its action according to Article 167(4) of the Treaty on the Functioning of the European Union, this Regulation should not affect Member States’ competences in their respective cultural policies, nor should it prejudice national measures addressed to intermediary service providers in order to protect the freedom of expression and information, media freedom and to foster media pluralism as well as cultural and linguistic diversity.
Amendment 221 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. Dispute resolution proceedings should be concluded within a reasonable period of time. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 229 #
Proposal for a regulation
Recital 58 a (new)
(58 a) Mitigation of risks, which would lead to removal, disabling access to or otherwise interfering with content and services for which a media service provider holds editorial responsibility, should not be considered reasonable or proportionate.
Amendment 233 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law where that is in conformity with Union law and what the precise nature or subject matter is of the law in question.
Amendment 245 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. Consequently, providers of services, such as cloud infrastructure, which are provided at the request of parties other than the content providers and only indirectly benefit the latter, should not be covered by this Regulation. This Regulation should cover, for example, providers of social media, video, image and audio-sharing services, as well as file-sharing services and other cloud services, insofar as those services are used to make the stored information available to the public at the direct request of the content provider. Where a service provider offers services other than hosting, this Regulation should apply only to the services that fall within its scope.
__________________ 39Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
Amendment 246 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end. Given the significant risks that arise from targeted advertising, including the amplification of illegal or harmful content and other risks associated with the reliance on pervasive tracking and data mining, targeting of advertising based on personal data should be prohibited. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. In this context, it is important to highlight that consent to targeted advertising should not be considered as freely given, specific and thus valid if access to the service is made conditional on processing of personal data and profiling techniques outside of the control of the user. This Regulation is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 249 #
Proposal for a regulation
Recital 73 a (new)
Amendment 250 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. Consequently, providers of services, such as cloud infrastructure, which are provided at the request of parties other than the content providers and only indirectly benefit the latter, should not be covered by the definition of online platforms. __________________ 39Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
Amendment 260 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
Article 1 – paragraph 5 a (new)
5 a. This Regulation shall not affect the possibility of Member States to adopt new legislation as well as to take regulatory measures, especially with regard to intermediary service providers that serve a legitimate public interest, in particular to protect the freedom of information and media or to foster the diversity of media and opinion or of cultural and linguistic diversity.
Amendment 271 #
Proposal for a regulation
Recital 21
Recital 21
(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not select, rank or modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted.
Amendment 271 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
Article 2 – paragraph 1 – point q a (new)
(q a) “media service provider” means the natural or legal person who has editorial responsibility for the content and services they offer, determines the manner in which they are organised, and complies with specific provisions, or an audiovisual media service provider within the meaning of Article 1 paragraph 1(a) of Directive 2010/13/EU;
Amendment 272 #
Proposal for a regulation
Recital 22
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content taking into account the potential harm the illegal content in question may create. In order to ensure a harmonised implementation of illegal content removal throughout the Union, the provider should, within 24 hours, remove or disable access to illegal content that can seriously harm public policy, public security or public health or seriously harm consumers’ health or safety. According to the well-established case-law of the Court of Justice and in line with Directive 2000/31/EC, the concept of ‘public policy’ involves a genuine, present and sufficiently serious threat which affects one of the fundamental interests of society, in particular for the prevention, investigation, detection and prosecution of criminal offences, including the protection of minors and the fight against any incitement to hatred on grounds of race, sex, religion or nationality, and violations of human dignity concerning individual persons. The concept of ‘public security’ as interpreted by the Court of Justice covers both the internal security of a Member State, which may be affected by, inter alia, a direct threat and physical security of the population of the Member State concerned, and the external security, which may be affected by, inter alia, the risk of a serious disturbance to the foreign relations of that Member State or to the peaceful coexistence of nations. Where the illegal content does not seriously harm public policy, public security, public health or consumers’ health or safety, the provider should remove or disable access to illegal content within seven days. The deadlines referred to in this Regulation should be without prejudice to specific deadlines set out in Union law or within administrative or judicial orders.
The provider may derogate from the deadlines referred to in this Regulation on the grounds of force majeure or for justifiable technical or operational reasons but it should be required to inform the competent authorities as provided for in this Regulation. The removal or disabling of access should be undertaken in the observance of the principle of the Charter of Fundamental Rights, including a high level of consumer protection and freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 274 #
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including at least one default options that areis not based on profiling of the recipient and alternative, third-party recommender systems where technically possible.
Amendment 275 #
Proposal for a regulation
Article 6 – paragraph 1 – subparagraph 1 (new)
Article 6 – paragraph 1 – subparagraph 1 (new)
Measures taken pursuant to paragraph 1 shall be effective, proportionate, specific, targeted and in accordance with the Charter.
Amendment 279 #
Proposal for a regulation
Article 7 a (new)
Article 7 a (new)
Article 7 a Prohibition of interference with content and services offered by media service providers and press publishers 1. Intermediary service providers shall not remove, disable access to or otherwise interfere with content and services made available by media service providers, who hold the editorial responsibility and comply with provisions consistent with EU and national law or by publishers of press publications within the meaning of Article 2(4) of Directive (EU) 2019/790. Publishers' and media service providers’ accounts shall not be suspended on the grounds of legal content and services they offer. 2. This Article shall not affect the possibility for an independent judicial or administrative authority of requiring the media service provider to terminate or prevent an infringement of applicable Union or national law.
Amendment 290 #
Proposal for a regulation
Recital 65 a (new)
Recital 65 a (new)
(65 a) Recipients of a service are often locked in to existing platforms due to network effects, which significantly limits user choice. In order to facilitate free choice of recipients between different services, it is therefore important to consider interoperability for industry- standard features of very large online platforms, such as core messaging functionality or image-sharing services. Such interoperability would empower recipients to choose a service based on its functionality and features such as security, privacy, and data processing standards, rather than its existing user base.
Amendment 292 #
Proposal for a regulation
Recital 66
Recital 66
(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, or about the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate.
Amendment 302 #
Proposal for a regulation
Recital 25
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner with the appropriate safeguards against over-removal of legal content. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 302 #
Proposal for a regulation
Article 12 – paragraph 2
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to national and Union law, the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service, in particular the freedom of expression and information, as enshrined in the Charter.
Amendment 305 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
Article 12 – paragraph 2 a (new)
2 a. Terms and conditions, or specific provisions thereof, community standards or any other internal guidelines or tools implemented by an intermediary service provider shall not be applied contrary to Article 7a.
Amendment 307 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
Article 12 – paragraph 2 b (new)
2 b. Intermediary service providers shall notify media service providers and publishers of press publications pursuant to Article 7a beforehand of any proposed changes to their general terms and conditions and to their parameters or algorithms that might affect the organisation, presentation and display of content and services. The proposed changes shall not be implemented before the expiry of a notice period that is reasonable and proportionate to the nature and extent of the proposed changes and their impact on media service providers and their content and services. That period shall begin on the date on which the online intermediary service provider notifies the media service providers of the proposed changes. The provision of new content and services on the intermediary services before the expiry of the notice period by a media service provider shall not be considered as a conclusive or affirmative action, given that such content is of particular importance for the exercise of fundamental rights, in particular the freedom of expression and information.
Amendment 320 #
Proposal for a regulation
Recital 28 a (new)
Recital 28 a (new)
(28a) Since media service providers hold editorial responsibility for the content and services they make available, such content and services should benefit from a specific regime that prevents multiple layers of control over such content and services. Such content and services are typically offered in accordance with professional and journalistic standards as well as legislation and are already subject to systems of supervision and control, often enshrined in commonly accepted self- regulatory standards and codes. In addition, media service providers usually have in place complaints handling mechanisms to resolve content-related disputes. Editorial responsibility means the exercise of effective control both over the selection of content and over its provision by means of its presentation, composition and organisation. Editorial responsibility does not necessarily imply any legal liability under national law for the content or the services provided. Intermediary service providers should refrain from removing, suspending or disabling access to any such content or services. Intermediary service providers should be exempt from liability for content and services offered by media service providers. A presumption of legality should exist in relation to the content and services provided by media service providers who carry out their activities in respect of European values and fundamental rights. Compliance by media service providers with these rules and regulations should be overseen by the respective independent regulatory authorities, bodies or both and the respective European networks they are organised in.
Amendment 322 #
Proposal for a regulation
Article 13 a (new)
Article 13 a (new)
Article 13 a Display of the identity of traders Intermediary service providers shall ensure that the identity, such as the trademark or logo or other characteristic traits, of the provider providing content, goods or services on the intermediary services is clearly visible alongside the content, goods or services offered.
Amendment 341 #
Proposal for a regulation
Recital 33
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.
Amendment 343 #
Proposal for a regulation
Recital 34
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market, and to ensure a safe and transparent online environment and a high level of consumer protection, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety, security and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 360 #
Proposal for a regulation
Recital 37
Recital 37
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. In addition, recipients of intermediary services should be able to hold the legal representative liable for non-compliance.
Amendment 365 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
Article 1 – paragraph 5 – point i a (new)
(i a) Directive 2006/123/EC
Amendment 383 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information, which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
Amendment 402 #
Proposal for a regulation
Article 2 – paragraph 1 – point q
Article 2 – paragraph 1 – point q
(q) ‘terms and conditions’ means all terms and conditions or specifications provided by the provider of intermediary services, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services.
Amendment 412 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union and shall submit a report of that risk assessment to the national competent authority of the Member State in which their legal representative is established. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 415 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for human dignity, private and family life, freedom of expression and information including the freedom and pluralism of the media, freedom of the art and science and the right to education, the prohibition of discrimination and the rights of the child, as enshrined in Articles 1, 7, 11, 13, 14, 21 and 24 of the Charter respectively;
Amendment 429 #
Proposal for a regulation
Article 13 a (new)
Article 13 a (new)
Article 13 a Online advertising transparency Providers of intermediary services that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear, concise and unambiguous manner and in real time: (a) that the information displayed on the interface or parts thereof is an online advertisement, including through prominent and harmonised marking; (b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement; (c) clear, meaningful and uniform information about the parameters used to determine the recipient to whom the advertisement is displayed; and (e) if the advertisement was displayed using an automated tool and the identity of the person responsible for that tool. 2. The Commission shall adopt an implementing act establishing harmonised specifications for the marking referred to in paragraph 1(a) of this Article. 3. Providers of intermediary services shall inform the natural or legal person on whose behalf the advertisement is displayed where the advertisement has been displayed. They shall also inform public authorities, upon their request. 4. Providers of intermediary services that display advertising on their online interfaces shall be able to give easy access to public authorities, NGOs, and researchers, upon their request, to information related to direct and indirect payments or any other remuneration received to display the corresponding advertisement on their online interfaces.
Amendment 431 #
Proposal for a regulation
Article 13 b (new)
Article 13 b (new)
Article 13 b Targeting of digital advertising 1. Providers of intermediary services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of showing digital advertising to recipients of their service, of other information society services, or directly to the public. 2. Providers of intermediary services may show targeted digital advertising based on contextual information. 3. The use of the contextual information referred to in paragraph 2 shall be permissible only if it does not allow for the direct or indirect identification of a natural person, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.
Amendment 432 #
Proposal for a regulation
Article 13 c (new)
Article 13 c (new)
Article 13 c Recipients’ consent for advertising practices 1. Providers of intermediary services shall not, by default, subject the recipients of their services to targeted, micro-targeted and behavioural advertisement, unless the recipient of the service has expressed a freely given, specific, informed and unambiguous consent to receiving such advertising. Providers of intermediary services shall ensure that recipients of services can easily make an informed choice when expressing their consent by providing them with meaningful information about the use of their personal data. 2. When processing personal data for targeted, micro-targeted and behavioural advertising, where consent has been received, online intermediaries shall comply with relevant Union law and shall not engage in activities that can lead to pervasive tracking, such as disproportionate combination of data collected by platforms, or disproportionate processing of special categories of personal data. 3. Providers of intermediary services shall organise their online interface in a way that provides clear information regarding the advertising parameters and allows the recipients of services to easily and efficiently access and modify those advertising parameters. Providers of intermediary services shall regularly monitor the use of advertising parameters by the recipients of services and make improvements to their use where necessary.
Amendment 438 #
Proposal for a regulation
Article 6 – paragraph 1 c (new)
Article 6 – paragraph 1 c (new)
Providers of intermediary services shall ensure that such measures are accompanied by appropriate safeguards, such as human oversight, documentation, traceability, transparency of algorithms used or additional measures to ensure the accuracy, fairness, transparency and non- discrimination of voluntary own-initiative investigations.
Amendment 447 #
Proposal for a regulation
Recital 50 a (new)
Recital 50 a (new)
(50a) After having obtained the necessary contact information of a trader, which is aimed at ensuring consumer rights, a provider of intermediary services needs to verify that these details are consistently kept up to date and accessible for consumers. Therefore, it shall conduct regular and randomised checks on the information provided by the traders on its platform. To ensure a consistent display of this contact information, intermediary services should establish mandatory designs for its inclusion. Content, goods or services shall only be displayed after all necessary information has been made available by the business user.
Amendment 452 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
Article 8 – paragraph 2 – point a – indent 1
— a statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed with due regard to fundamental rights of the recipient of the service concerned;
Amendment 454 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 a (new)
Article 8 – paragraph 2 – point a – indent 1 a (new)
- identification of the competent judicial or administrative authority;
Amendment 455 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 b (new)
Article 8 – paragraph 2 – point a – indent 1 b (new)
- reference to the legal basis for the order;
Amendment 476 #
Proposal for a regulation
Article 38 a (new)
Article 38 a (new)
Article 38 a Relation to sector-specific provisions The application of these provisions does not affect areas that are subject to sector- specific regulation and provisions. In these areas, the responsibility for enforcing the provisions lies with the competent national authorities, which are organised in European networks. Within these networks, the competent authorities shall establish suitable procedures that allow for effective coordination and consistent application and enforcement of this Regulation.
Amendment 480 #
Proposal for a regulation
Article 48 – paragraph 1
Article 48 – paragraph 1
1. The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator, notably representatives of European regulatory networks of independent national regulatory authorities, bodies or both, shall participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them.
Amendment 481 #
Proposal for a regulation
Recital 57
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including unsafe, counterfeit or non-compliant products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices.
Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 488 #
Proposal for a regulation
Recital 58 a (new)
Recital 58 a (new)
(58a) Mitigation of risks, which would lead to removal, disabling access to or otherwise interfering with media services and content for which a media service provider holds editorial responsibility, should not be considered reasonable or proportionate.
Amendment 496 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, with due regard to fundamental rights of the recipient of the service concerned, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences;
Amendment 499 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
Article 9 – paragraph 2 – point a – indent 1 a (new)
- identification of the competent judicial or administrative authority;
Amendment 501 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 b (new)
Article 9 – paragraph 2 – point a – indent 1 b (new)
- reference to the legal basis for the order;
Amendment 508 #
Proposal for a regulation
Recital 65 a (new)
Recital 65 a (new)
(65a) Due to their market position, very large online platforms have developed an increasing influence over society’s social, economic, and political interactions. Consumers face a lock-in situation, which may lead them into accepting unfavourable terms and conditions to participate in the services provided by these very large online platforms. To restore a competitive market and to allow consumers more choices, very large online platforms should be required to set up the necessary technical access points to create interoperability for their core services, with a view to allowing competitors a fairer market access and enabling more choice for consumers, while at the same time complying with privacy, security and safety standards. These access points should create interoperability for other online platform services of the same type, without the need to convert digital content or services to ensure functionality.
Amendment 534 #
Proposal for a regulation
Article 12 – paragraph 1
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used by the provider of the intermediary service for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format and include a searchable archive of previous versions of the provider’s terms and conditions.
Amendment 536 #
Proposal for a regulation
Recital 73 a (new)
Recital 73 a (new)
(73a) The designation of a Digital Services Coordinator in the Member States should be without prejudice to already existing enforcement mechanisms, such as in electronic communications or media regulation, and independent regulatory structures in these fields as defined by European and national law. The competences of the Digital Services Coordinator should not interfere with those of the appointed authorities. For ensuring coordination and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union, the different European networks, in particular the European Regulators Group for Audiovisual Media Services (ERGA) and the Body of European Regulators for Electronic Communications (BEREC), should be responsible. For the effective implementation of this task, these networks should develop suitable procedures to be applied in cases concerning this Regulation.
Amendment 547 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
Article 12 – paragraph 2 a (new)
2a. Providers of intermediary services shall provide recipients of services with a concise and easily readable summary of the terms and conditions. That summary shall identify the main elements of the information requirements, including the possibility of easily opting out of optional clauses and the remedies available.
Amendment 581 #
Proposal for a regulation
Article 13 a (new)
Article 13 a (new)
Amendment 584 #
Proposal for a regulation
Article 13 b (new)
Article 13 b (new)
Article 13b
Online interface design
1. Providers of intermediary services shall refrain from subverting or impairing the autonomous decision-making or free choice of a recipient of a service through the design, functioning or operation of online interfaces or a part thereof, such as but not limited to:
(a) according visual prominence to one option when asking the recipient of the service for consent or a decision;
(b) repeatedly requesting consent to data processing, or requesting a change to a setting or configuration of the service, after the recipient of the service has already made his or her choice;
(c) making the procedure of cancelling a service more difficult than signing up to it.
2. A choice or decision made by the recipient of the service using an online interface that does not comply with the requirements of this Article shall not constitute consent in accordance with Regulation (EU) 2016/679.
3. The Commission shall be empowered to publish guidelines indicating specific design choices that qualify as subverting or impairing the autonomy, decision-making processes or choices of the recipient of the service.
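Point 1(a) of the amendment above concerns visual prominence between options. Purely as an illustration, and not as part of the amendment, a hypothetical sketch of how an interface audit could flag a consent dialog that visually favours one option (all names and the prominence criteria are assumptions, not anything the provision prescribes):

```python
from dataclasses import dataclass

@dataclass
class Option:
    label: str
    area_px: int       # rendered size of the button
    highlighted: bool  # accent colour, pre-selection, and similar cues

def equally_prominent(accept: Option, decline: Option) -> bool:
    """Flag dialogs that give one option more visual weight than the other."""
    return accept.area_px == decline.area_px and accept.highlighted == decline.highlighted

# a dialog with identically sized, unhighlighted buttons passes the check
dialog_ok = equally_prominent(Option("Accept", 100, False), Option("Decline", 100, False))
# an oversized, highlighted accept button is flagged
dialog_bad = equally_prominent(Option("Accept", 200, True), Option("Decline", 100, False))
```

In practice a real audit would look at far more signals (contrast, placement, wording); the sketch only shows the shape of such a comparison.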
Amendment 588 #
Proposal for a regulation
Article 14 – paragraph 1
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user- friendly, clearly visible on the hosting service interface, and allow for the submission of notices exclusively by electronic means and in the language of the individual or entity submitting a notice.
Amendment 589 #
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 597 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, such as the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
Amendment 599 #
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
Article 1 – paragraph 1 – introductory part
1. This Regulation lays down harmonised rules on the provision of intermediary services in order to improve the functioning of the internal market. In particular, it establishes:
Amendment 604 #
Proposal for a regulation
Article 1 – paragraph 2 – point a
Article 1 – paragraph 2 – point a
(a) contribute to the proper functioning of the internal market for intermediary services to ensure fair competition;
Amendment 606 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
Article 1 – paragraph 2 – point b
(b) set out harmonised rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter, including a high level of consumer protection, are effectively protected.
Amendment 606 #
Proposal for a regulation
Article 14 – paragraph 3
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 on the basis of which a diligent economic operator can identify the illegality of the content in question shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
Amendment 612 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
Amendment 624 #
Proposal for a regulation
Article 1 – paragraph 5 – introductory part
Article 1 – paragraph 5 – introductory part
5. This Regulation shall and will not affect the rules laid down by the following:
Amendment 626 #
Proposal for a regulation
Article 1 – paragraph 5 – point b
Article 1 – paragraph 5 – point b
(b) Directive (EU) 2018/1808 and Directive (EU) 2019/882;
Amendment 636 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
Article 1 – paragraph 5 – point i a (new)
(ia) Directive (EU) 2020/1828;
Amendment 640 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
Article 1 – paragraph 5 – point i a (new)
(ia) Directive (EU) 2019/882;
Amendment 641 #
Proposal for a regulation
Article 1 – paragraph 5 – point i b (new)
Article 1 – paragraph 5 – point i b (new)
(ib) Directive 2013/11/EU;
Amendment 642 #
Proposal for a regulation
Article 15 b (new)
Article 15 b (new)
Article 15b
Content moderation staff
Providers of hosting services shall ensure that staff working on content moderation are adequately qualified, including through ongoing training on the applicable legislation and fundamental rights. The provider shall also ensure appropriate working conditions, including the opportunity to seek professional support, qualified psychological assistance and qualified legal advice.
Amendment 643 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
Article 1 – paragraph 5 a (new)
5a. This Regulation shall not affect the possibility for Member States to adopt new legislation or to take regulatory measures that serve a legitimate public interest, especially with regard to intermediary service providers, in particular to protect the freedom of information and of the media, to foster the diversity of media and opinion, or to foster cultural and linguistic diversity.
Amendment 645 #
Proposal for a regulation
Article 1 a (new)
Article 1 a (new)
Article 1a
No circumvention of the rules set out in this Regulation
1. Any contractual provision between an intermediary service provider and a recipient of its service, between an intermediary service provider and a trader, or between a recipient of its service and a trader, which is contrary to this Regulation, is invalid.
2. This Regulation shall apply irrespective of the law applicable to contracts.
Amendment 646 #
Proposal for a regulation
Article 1 a (new)
Article 1 a (new)
Article 1a
Objective
The aim of this Regulation is to contribute to the proper functioning of the internal market by setting out harmonised rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
Amendment 651 #
Proposal for a regulation
Article 2 – paragraph 1 – point c
Article 2 – paragraph 1 – point c
(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft or profession;
Amendment 656 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
Article 17 – paragraph 1 – point a
(a) decisions against or in favour of removal or disabling of access to the information;
Amendment 657 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
Article 17 – paragraph 1 – point b
(b) decisions against or in favour of suspension or termination of the provision of the service, in whole or in part, to the recipients;
Amendment 659 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
Article 17 – paragraph 1 – point c
(c) decisions against or in favour of suspension or termination of the recipients’ account.
Amendment 663 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
Article 17 – paragraph 1 – point c a (new)
(ca) decisions against or in favour of demonetising content provided by the recipients;
Amendment 665 #
Proposal for a regulation
Article 17 – paragraph 1 – point c b (new)
Article 17 – paragraph 1 – point c b (new)
(cb) decisions against or in favour of applying additional labels or information to content provided by the recipients;
Amendment 668 #
Proposal for a regulation
Article 17 – paragraph 1 – point c c (new)
Article 17 – paragraph 1 – point c c (new)
(cc) decisions that adversely affect the recipient’s access to significant features of the platform’s regular services;
Amendment 669 #
Proposal for a regulation
Article 17 – paragraph 1 – point c d (new)
Article 17 – paragraph 1 – point c d (new)
(cd) decisions not to act upon a notice.
Amendment 677 #
Proposal for a regulation
Article 17 – paragraph 5 a (new)
Article 17 – paragraph 5 a (new)
5a. Online platforms shall ensure that any relevant information in relation to decisions taken by the internal complaint- handling mechanism is available to recipients of the service for the purpose of seeking redress through an out-of-court dispute settlement body pursuant to Article 18 or before a court.
Amendment 679 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:
Amendment 681 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following:
Amendment 682 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or the provision of services, is not in compliance with Union law or with a law of a Member State where it is in conformity with Union law, irrespective of the precise subject matter or nature of that law;
Amendment 683 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of a Member State that is in conformity with Union law, irrespective of the precise subject matter or nature of that law;
Amendment 684 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
Article 28 – paragraph 1 – point a
(a) Compliance with the obligations set out in Chapter III;
Amendment 685 #
Proposal for a regulation
Article 28 – paragraph 1 – point a a (new)
Article 28 – paragraph 1 – point a a (new)
(aa) Adequacy of the risk assessment undertaken pursuant to Article 26(1) and the corresponding risk mitigation measures undertaken pursuant to Article 27(1);
Amendment 686 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
Article 28 – paragraph 1 – point b
(b) Compliance with any commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37.
Amendment 687 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
Article 28 – paragraph 1 – point b
(b) any commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36, the crisis protocols referred to in Article 37, and self- or co-regulatory actions that they have undertaken.
Amendment 687 #
Proposal for a regulation
Article 18 – paragraph 2 – point a
Article 18 – paragraph 2 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms, including aspects such as financial resources and personnel;
Amendment 688 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by expert organisations, previously vetted by the Board, which:
Amendment 690 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
Article 28 – paragraph 2 – point a
(a) are independent from the very large online platform concerned as well as from other very large online platforms;
Amendment 691 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
Article 28 – paragraph 2 – point a
(a) are independent from and do not have conflicts of interest with the very large online platform concerned;
Amendment 692 #
Proposal for a regulation
Article 28 – paragraph 2 – point b
Article 28 – paragraph 2 – point b
(b) have demonstrated expertise in the area of risk management, technical competence and capabilities, and, where applicable, can demonstrably draw upon expertise in fields related to the risks investigated or related research methodologies;
Amendment 693 #
Proposal for a regulation
Article 28 – paragraph 2 – point c
Article 28 – paragraph 2 – point c
(c) have demonstrated objectivity and professional ethics, based in particular on adherence to relevant codes of practice or appropriate standards.
Amendment 694 #
Proposal for a regulation
Article 28 – paragraph 3 – introductory part
Article 28 – paragraph 3 – introductory part
3. The organisations that perform the audits shall establish a meaningful, granular, comprehensive and independent audit report for each audit. The report shall be in writing and include at least the following:
Amendment 696 #
Proposal for a regulation
Article 28 – paragraph 3 – point d
Article 28 – paragraph 3 – point d
(d) a description of the main findings drawn from the audit and a summary of the main findings;
Amendment 696 #
Proposal for a regulation
Article 18 – paragraph 2 – point e
Article 18 – paragraph 2 – point e
(e) the dispute settlement takes place in accordance with clear, fair and publicly available rules of procedure.
Amendment 697 #
Proposal for a regulation
Article 28 – paragraph 3 – point d a (new)
Article 28 – paragraph 3 – point d a (new)
Amendment 697 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1
Article 18 – paragraph 2 – subparagraph 1
The Digital Services Coordinator shall, where applicable, specify in the certificate the particular issues to which the body’s expertise relates and the official language or languages of the Union in which the body is capable of settling disputes, as referred to in points (b) and (d) of the first subparagraph, respectively. Certified out-of-court dispute settlement bodies shall conclude dispute resolution proceedings within a reasonable period of time.
Amendment 698 #
Proposal for a regulation
Article 28 – paragraph 3 – point d b (new)
Article 28 – paragraph 3 – point d b (new)
(d b) a description of the third-parties consulted to inform the audit;
Amendment 699 #
Proposal for a regulation
Article 28 – paragraph 3 – point e
Article 28 – paragraph 3 – point e
(e) an audit opinion on whether the very large online platform subject to the audit meaningfully complied with the obligations and with the commitments referred to in paragraph 1, either positive, positive with comments or negative;
Amendment 701 #
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 2
Article 18 – paragraph 3 – subparagraph 2
Certified out-of-court dispute settlement bodies shall make information on the fees, or the mechanisms used to determine the fees, publicly available.
Amendment 703 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
Article 18 – paragraph 6 a (new)
6a. Decisions reached by an out-of- court dispute settlement body shall not be disputable by another out-of-court dispute settlement body and the resolution of a particular dispute may only be discussed in one out-of-court dispute settlement body.
Amendment 704 #
Proposal for a regulation
Article 28 – paragraph 4
Article 28 – paragraph 4
4. Very large online platforms shall ensure that auditors have access to all relevant information needed to perform their duties. Very large online platforms receiving an audit report that contains evidence of wrongdoing shall apply the recommendations addressed to them and take all the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
Amendment 705 #
Proposal for a regulation
Article 28 – paragraph 4 – subparagraph 1 (new)
Article 28 – paragraph 4 – subparagraph 1 (new)
Auditors shall submit their audit report to the Board at the same time as the very large online platform concerned. Within a reasonable period of time, the Board shall issue recommendations, monitor the implementation of the report and suggest the adoption of sanctions by the competent Digital Service Coordinator when the very large online platform fails to abide by the Regulation.
Amendment 706 #
Proposal for a regulation
Article 28 – paragraph 4 – point 1 (new)
Article 28 – paragraph 4 – point 1 (new)
(1) The Board, after consulting stakeholders and the Commission, shall publish guidelines about how audits should be conducted by the auditors, how they should be implemented by very large online platforms and how authorities will monitor and enforce the Regulation in this regard.
Amendment 707 #
Proposal for a regulation
Article 28 – paragraph 4 – point 2 (new)
Article 28 – paragraph 4 – point 2 (new)
(2) The Board shall publish and regularly update a list of vetted auditors that very large online platforms can resort to. The Board shall publish and regularly review detailed criteria auditors need to meet.
Amendment 724 #
Proposal for a regulation
Article 19 – paragraph 2 – point c
Article 19 – paragraph 2 – point c
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner and in full respect of fundamental rights such as the freedom of expression and information.
Amendment 734 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
Article 2 – paragraph 1 – point q a (new)
(qa) ‘media service provider’ means a natural or legal person who has editorial responsibility for the content and services they offer, determines the manner in which they are organised and complies with specific provisions, or an audiovisual media service provider within the meaning of Article 1(1)(a) of Directive 2010/13/EU;
Amendment 736 #
Proposal for a regulation
Article 19 – paragraph 5
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise, inadequately substantiated or incorrect notices, or notices violating recipients’ fundamental rights, through the mechanisms referred to in Article 14, including information gathered in connection with the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
Amendment 743 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
Article 2 – paragraph 1 – point q b (new)
(qb) ‘persons with disabilities’ means persons within the meaning of Article 3(1) of Directive (EU) 2019/882;
Amendment 765 #
Proposal for a regulation
Article 5 – paragraph 2
Article 5 – paragraph 2
2. Paragraph 1 shall not apply where the recipient of the service is acting under the authority, decisive influence or control of the provider.
Amendment 769 #
Proposal for a regulation
Article 33 a (new)
Article 33 a (new)
Article 33a
Interoperability
1. By 31 December 2024, very large online platforms shall make the main functionalities of their services interoperable with other online platforms to enable the cross-platform exchange of information. This obligation shall not limit, hinder or delay their ability to solve security issues. Very large online platforms shall publicly document all application programming interfaces they make available.
2. The Commission shall adopt implementing measures specifying the nature and scope of the obligations set out in paragraph 1.
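The interoperability obligation above envisages publicly documented access points through which competing services can exchange information without format conversion. Purely as an illustration, and not as part of the amendment, a hypothetical in-memory sketch of such an exchange (all names are invented; no real platform API is implied):

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    sender: str     # account on the originating service
    recipient: str  # account on the receiving platform
    body: str       # passed through unchanged: no conversion of content

@dataclass
class PlatformEndpoint:
    """Stand-in for a very large platform's documented technical access point."""
    inboxes: dict = field(default_factory=dict)

    def deliver(self, message: Message) -> bool:
        # accept a message originating on another service into a local inbox
        self.inboxes.setdefault(message.recipient, []).append(message)
        return True

def cross_platform_send(endpoint: PlatformEndpoint, message: Message) -> bool:
    # a competing service relays its user's message through the public endpoint
    return endpoint.deliver(message)

vlop = PlatformEndpoint()
ok = cross_platform_send(vlop, Message("alice@small-service", "bob@vlop", "hello"))
```

The sketch only shows the core idea of a shared, documented message structure; security, authentication and abuse handling, which the amendment also requires, are out of scope here.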
Amendment 777 #
Proposal for a regulation
Article 5 a (new)
Article 5 a (new)
Amendment 813 #
Proposal for a regulation
Article 23 – paragraph 1 – point a
Article 23 – paragraph 1 – point a
(a) the number of disputes submitted to certified out-of-court dispute settlement bodies referred to in Article 18, the outcomes of the dispute settlement and the average time needed for completing the dispute settlement procedures;
Amendment 818 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 a (new)
Article 8 – paragraph 2 – point a – indent 1 a (new)
— a precise indication of the credentials of the relevant national judicial or administrative authority issuing the order and details of the contact person(s) within the said authority;
Amendment 827 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
Amendment 832 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Article 24 – paragraph 1 a (new)
2. Online platforms that display advertising on their online interfaces shall include in the reports referred to in Article 13 the following information:
(a) the number of advertisements removed, disabled or labelled by the online platform, accompanied by a justification explaining the grounds for the decision;
(b) aggregated data on the providers of the online advertisements that were removed, disabled or labelled by the online platform, including information on the advertisement published, the amount paid for the advertisement and information on the target audience, if applicable.
Amendment 855 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services and activities, such as business model and design decisions, in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 866 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of fundamental rights, including the respect for private and family life, freedom of expression and information, freedom and pluralism of the media, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
Amendment 893 #
Proposal for a regulation
Chapter III – title
Chapter III – title
Due diligence obligations for a transparent, accessible and safe online environment
Amendment 907 #
Proposal for a regulation
Article 10 a (new)
Article 10 a (new)
Amendment 928 #
Proposal for a regulation
Article 12 – paragraph 1
Article 12 – paragraph 1
1. Providers of intermediary services shall use fair, non-discriminatory and transparent contract terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be drafted in clear and unambiguous language and shall be publicly available in an easily accessible format in a searchable archive of all the previous versions with their dates of application.
Amendment 930 #
Proposal for a regulation
Article 29 – paragraph 1
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679. Any option based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679 shall never be the default setting of a recommender system.
Amendment 934 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
Article 29 – paragraph 1 a (new)
Amendment 940 #
Proposal for a regulation
Article 12 – paragraph 2
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to national and Union law, the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service, in particular the freedom of expression and information, as enshrined in the Charter.
Amendment 942 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
Article 29 – paragraph 2 a (new)
2a. Very large online platforms that use recommender systems shall allow the recipient of the service to have information presented to them in a chronological order only and, where technically possible, to use third-party recommender systems. Third-party recommender systems shall have access to the same information available to the recommender systems used by the platform, notwithstanding the platform’s obligations under Regulation (EU) 2016/679. Very large online platforms may only temporarily limit access to third- party recommender systems in case of provable abuse by the third-party provider or when justified by an immediate requirement to address a technical issue such as a serious security vulnerability.
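The amendment above requires a purely chronological presentation option alongside any profiling-based ranking (which, under the related amendment to Article 29(1), must never be the default). Purely as an illustration, and not as part of the amendment, a minimal sketch of that distinction (all names are hypothetical):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Item:
    posted_at: datetime
    relevance: float  # a score a profiling-based ranker might assign

def recommend(items: list[Item], use_profiling: bool = False) -> list[Item]:
    """Default ordering is purely chronological (newest first);
    profiling-based ranking is strictly opt-in, never the default."""
    if use_profiling:
        return sorted(items, key=lambda i: i.relevance, reverse=True)
    return sorted(items, key=lambda i: i.posted_at, reverse=True)

feed = [
    Item(datetime(2021, 1, 1), relevance=0.9),
    Item(datetime(2021, 3, 1), relevance=0.1),
]
default_feed = recommend(feed)         # chronological, no profiling involved
profiled_feed = recommend(feed, True)  # only if the recipient opts in
```

A third-party recommender, as contemplated by the amendment, would be another function of the same shape given access to the same items.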
Amendment 948 #
Proposal for a regulation
Article 30 – paragraph 1
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces an easily accessible and searchable repository containing the information referred to in paragraph 2, until five years after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
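The ad repository above must be searchable, retain records for five years after the last display, and hold no personal data of recipients. Purely as an illustration, and not as part of the amendment, a minimal sketch of such a repository (all names are invented; the retention arithmetic uses a simple 365-day year for illustration):

```python
from dataclasses import dataclass
from datetime import date, timedelta

RETENTION = timedelta(days=5 * 365)  # five years after the last display

@dataclass
class AdRecord:
    content: str
    sponsor: str          # person on whose behalf the advertisement ran
    funder: str           # person who financed the advertisement
    last_displayed: date
    # deliberately no fields about individual recipients: the repository
    # must not contain personal data of the recipients of the service

class AdRepository:
    def __init__(self) -> None:
        self._records: list[AdRecord] = []

    def add(self, record: AdRecord) -> None:
        self._records.append(record)

    def query(self, today: date, keyword: str = "") -> list[AdRecord]:
        """Searchable API view limited to records inside the retention window."""
        return [
            r for r in self._records
            if today - r.last_displayed <= RETENTION
            and keyword.lower() in r.content.lower()
        ]

repo = AdRepository()
repo.add(AdRecord("Buy shoes", "ShoeCo", "ShoeCo Holding", date(2020, 1, 1)))
repo.add(AdRecord("Old campaign", "ShoeCo", "ShoeCo Holding", date(2010, 1, 1)))
recent = repo.query(date(2021, 1, 1))
```

The query filters out the 2010 record, which fell outside the five-year window, while the 2020 record remains retrievable.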
Amendment 951 #
Proposal for a regulation
Article 30 – paragraph 2 – point b
Article 30 – paragraph 2 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
Amendment 956 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
Article 12 – paragraph 2 b (new)
2b. Intermediary service providers shall notify media service providers pursuant to Article 7a in advance of any proposed changes to their general terms and conditions and to their parameters or algorithms that might affect the organisation, presentation and display of content and services. The proposed changes shall not be implemented before the expiry of a notice period that is reasonable and proportionate to the nature and extent of the proposed changes and their impact on media service providers and their content and services. That period shall begin on the date on which the intermediary service provider notifies the media service providers of the proposed changes. The provision of new content and services on the intermediary services by a media service provider before the expiry of the notice period shall not be considered a conclusive or affirmative action, given that such content is of particular importance for the exercise of fundamental rights, in particular the freedom of expression and information.
Amendment 956 #
(ea) any decisions by the online platform regarding labelling, removal or disabling of online advertisements, including a justification explaining the grounds for the decision.
Amendment 972 #
Proposal for a regulation
Article 31 – paragraph 3
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate, in an easily accessible and user-friendly format. This shall include personal data only where it is lawfully accessible by the public and without prejudice to Regulation (EU) 2016/679.
Amendment 975 #
Proposal for a regulation
Article 31 – paragraph 4
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions or civil society organisations representing the public interest, be independent from commercial interests, disclose the sources of funding financing their research, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
Amendment 977 #
Proposal for a regulation
Article 31 – paragraph 5
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this Regulation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 998 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
Article 13 – paragraph 1 a (new)
1a. Providers of intermediary services shall ensure that the identity, such as the trademark, logo or other characteristic traits, of the business user providing the goods, content or services on the intermediary services is clearly visible alongside the goods, content or services offered.
Amendment 1006 #
Proposal for a regulation
Article 33 a (new)
Article 33 a (new)
Amendment 1011 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
Article 34 – paragraph 2 a (new)
2a. The absence of agreement on voluntary industry standards shall not prevent the applicability or implementation of any measures outlined in this Regulation.
Amendment 1015 #
Proposal for a regulation
Article 13 a (new)
Article 13 a (new)
Amendment 1016 #
Proposal for a regulation
Article 13 a (new)
Article 13 a (new)
Article 13a Remedies for consumers Consumers harmed by practices contrary to this Regulation shall have access to proportionate and effective remedies, including compensation for damage suffered by consumers and, where relevant, a price reduction or the termination of the contract. Member States may determine the conditions for the application and effects of those remedies. Member States may take into account, where appropriate, the gravity and nature of the illegal practices, the damage suffered by consumers and other relevant circumstances.
Amendment 1017 #
Proposal for a regulation
Article 13 a (new)
Article 13 a (new)
Article 13a Display of the identity of traders Intermediary service providers shall ensure that the identity, such as the trademark or logo or other characteristic traits, of the provider providing content, goods or services on the intermediary services is clearly visible alongside the content, goods or services offered.
Amendment 1029 #
Proposal for a regulation
Article 14 – paragraph 1
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means, for example through online web forms.
Amendment 1049 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, such as the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
Amendment 1079 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
Article 14 – paragraph 6 a (new)
6a. Where the explanation of the reasons referred to in paragraph 2(a) does not allow a diligent economic operator to identify the illegality of the content in question, where the notified content is not illegal in the country of establishment of the hosting service, or where there is genuine, demonstrable doubt about the illegality of the content, the hosting service may seek further clarification from the relevant authority or the national Digital Services Coordinator.
Amendment 1111 #
Proposal for a regulation
Article 15 – paragraph 2 – point d
Article 15 – paragraph 2 – point d
(d) where the decision concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground, including explanations in relation to the arguments submitted under Article 14, paragraph 2a, where relevant.
Amendment 1132 #
Proposal for a regulation
Article 15 a (new)
Article 15 a (new)
Amendment 1145 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, and individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the decision taken by the provider of the online platform not to act upon receipt of a notice or against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 1152 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
Article 17 – paragraph 1 – point a
(a) decisions whether or not to remove or disable access to or restrict visibility of the information;
Amendment 1159 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
Article 17 – paragraph 1 – point b
(b) decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
Amendment 1163 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
Article 17 – paragraph 1 – point c
(c) decisions whether or not to suspend or terminate the recipients’ account.
Amendment 1200 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall always direct recipients to an out-of-court dispute settlement body. The information about the competent out-of-court body shall be easily accessible on the online interface of the online platform in a clear and user-friendly manner.
Amendment 1205 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 2
Article 18 – paragraph 1 – subparagraph 2
Amendment 1208 #
Proposal for a regulation
Article 18 – paragraph 1 a (new)
Article 18 – paragraph 1 a (new)
1a. Online platforms shall engage, in good faith, with the independent, external certified body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 1240 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 2
Article 18 – paragraph 2 – subparagraph 2
The Digital Services Coordinator shall, where applicable, specify in the certificate the particular issues to which the body’s expertise relates and the official language or languages of the Union in which the body is capable of settling disputes, as referred to in points (b) and (d) of the first subparagraph, respectively. The Digital Services Coordinator shall conduct regular checks on the certified bodies to ensure that they comply with the requirements listed under paragraph 2a on an ongoing basis.
Amendment 1243 #
Proposal for a regulation
Article 18 – paragraph 2 a (new)
Article 18 – paragraph 2 a (new)
2a. Certified out-of-court dispute settlement bodies shall draw up annual reports listing the number of complaints received annually, the outcomes of the decisions delivered, any systematic or sectoral problems identified, and the average time taken to resolve the disputes.
Amendment 1250 #
Proposal for a regulation
Article 18 – paragraph 3 a (new)
Article 18 – paragraph 3 a (new)
3a. Certified out-of-court dispute settlement bodies shall adopt a decision seeking to resolve the dispute no later than 90 calendar days starting on the date on which the certified body has received the complaint.
3b. Certified out-of-court dispute settlement bodies shall draw up annual reports listing the number of complaints received annually, the outcomes of the decisions delivered, any systematic or sectoral problems identified, and the average time taken to resolve the disputes.
Amendment 1350 #
Proposal for a regulation
Article 20 a (new)
Article 20 a (new)
Article 20a Content of public interest 1. When an online platform takes the decision to remove content or to suspend the provision of its services to a recipient of the service, it shall take into account whether the content is or appears to be specifically intended to contribute to public policy objectives, in particular where the content is of particular importance to public policy, public security or public health objectives at Union or national level. 2. If an online platform decides to remove content, or to suspend the provision of its services to a user, where that content is or appears to be of public interest related to public policy, public security or public health, the online platform shall take the necessary technical and organisational measures to ensure that complaints through the internal complaint-handling system referred to in Article 17 are processed and decided upon with priority and without delay.
Amendment 1362 #
Proposal for a regulation
Article 21 – paragraph 2 a (new)
Article 21 – paragraph 2 a (new)
2a. When a platform that allows consumers to conclude distance contracts with traders becomes aware that a piece of information, a product or service poses a serious risk to the life, health or safety of consumers, it shall promptly inform the competent authorities of the Member State or Member States concerned and provide all relevant information available.
Amendment 1398 #
Proposal for a regulation
Article 22 – paragraph 1 – point f a (new)
Article 22 – paragraph 1 – point f a (new)
(fa) whether the drop shipping principle is applied, i.e. goods are offered that are not in stock in the retailer's warehouse;
Amendment 1402 #
Proposal for a regulation
Article 22 – paragraph 2
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, and prior to allowing traders to use its services, make reasonable efforts to verify whether the information referred to in paragraph 1 is reliable, complete and up-to-date through the use of any freely accessible official online database or online interface made available by a Member State or the Union, through data processed by the online platform, or through requests to the trader to provide supporting documents from reliable sources. Online platforms covered under this Article shall verify the information listed in paragraph 1 from traders that already use their services prior to the entry into force and application of this Regulation.
Amendment 1411 #
Proposal for a regulation
Article 22 – paragraph 2 a (new)
Article 22 – paragraph 2 a (new)
Amendment 1423 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 2 a (new)
Article 22 – paragraph 3 – subparagraph 2 a (new)
In addition, the platforms covered under this obligation shall conduct random checks on the products and services traders offer on their online interfaces or parts thereof. These checks shall include, but not be limited to, regular and meaningful mystery shopping exercises and visual inspections.
Amendment 1431 #
Proposal for a regulation
Article 22 – paragraph 4
Article 22 – paragraph 4
4. The online platform shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of its contractual relationship with the trader concerned. It shall subsequently delete the information, without prejudice to sector-specific legislation with longer storage requirements.
Amendment 1442 #
Proposal for a regulation
Article 22 – paragraph 6
Article 22 – paragraph 6
6. The online platform shall make the information referred to in points (a), (d), (e), (f) and (g) of paragraph 1 available to the recipients of the service, easily accessible in accordance with Directive (EU) 2019/882, in a clear and comprehensible manner.
Amendment 1447 #
Proposal for a regulation
Article 22 – paragraph 6 a (new)
Article 22 – paragraph 6 a (new)
6a. In order to comply with point (g) of paragraph 1, web shops shall indicate, next to the depicted goods, whether those goods are in stock or whether a manufacturer must first be found for them. Online marketplaces shall provide third-party sellers with a dropshipping labelling tool, which the sellers must use in order to be approved by the platform.
Amendment 1450 #
Proposal for a regulation
Article 22 – paragraph 7
Article 22 – paragraph 7
7. The online platform shall design and organise its online interface in a way that enables itself and traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union and Member State law on consumer protection, including on product safety. Traders that do not fulfil their obligations under consumer and product safety legislation should be suspended and, as a last resort, not allowed on the platform. The online platform shall not subvert or impair consumers’ autonomy, decision-making, or choice via the structure, function or manner of operation of its online interface or a part thereof.
Amendment 1452 #
Proposal for a regulation
Article 22 – paragraph 7 a (new)
Article 22 – paragraph 7 a (new)
7a. The online platform may rely on the information provided by third-party suppliers referred to in point (b) of Article 6a of Directive (EU) 2019/2161, unless the platform knows or ought to know, based on the available data regarding transactions on the platform, that this information is incorrect. Online platforms must take adequate measures to prevent traders from appearing on the platform as non-traders.
Amendment 1454 #
Proposal for a regulation
Article 22 – paragraph 7 b (new)
Article 22 – paragraph 7 b (new)
7b. An online platform is liable for damage caused to consumers by a violation of its duties under this Article.
Amendment 1455 #
Proposal for a regulation
Article 22 – paragraph 7 c (new)
Article 22 – paragraph 7 c (new)
7c. The online platform must inform the consumer, at the earliest possible point in time and immediately before the distance contract with a third-party provider is concluded, in a prominent manner that the consumer is concluding a contract with the third party and not with the online platform. If the online platform violates this duty to provide information, the consumer may also assert against the online platform the rights and legal remedies for non-performance arising from the distance contract with the third party.
Amendment 1456 #
Proposal for a regulation
Article 22 – paragraph 7 d (new)
Article 22 – paragraph 7 d (new)
7d. If an online platform provides misleading information about third-party providers, about goods, services or digital content offered by third-party providers or about other provisions of the distance contract, the online platform is liable for the damage that this misleading information inflicts on consumers.
Amendment 1457 #
Proposal for a regulation
Article 22 – paragraph 7 e (new)
Article 22 – paragraph 7 e (new)
7e. An online platform is liable for guarantees which it gives about a third-party supplier or about goods, services or digital content offered by a third-party supplier.
Amendment 1463 #
Proposal for a regulation
Article 22 a (new)
Article 22 a (new)
Article 22a Duty to protect recipients of the service Operators of online platforms allowing consumers to conclude distance contracts with traders or consumers, or of very large online platforms according to Article 25, who fail to take adequate measures for the protection of the recipients of the service upon obtaining credible evidence of criminal conduct of a recipient of the service to the detriment of other recipients, or of the illegality of a certain product, service, commercial practice or advertising method of a third-party supplier, shall be held liable for the damage resulting from such a failure.
Amendment 1507 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Article 24 – paragraph 1 a (new)
Online platforms or advertising service providers that display advertisements shall also check the accuracy of the information about the advertiser in accordance with the due diligence obligations pursuant to Article 22. Where there are indications of dubious offers, such as obvious irregularities, user reports or web shops "blacklisted" on warning lists, the platforms or the advertising service providers behind them shall not display the advertising.
Amendment 1520 #
Proposal for a regulation
Article 24 a (new)
Article 24 a (new)
Article 24a Right to information 1. Where an online platform becomes aware, irrespective of the means used, of the illegal nature of a product or service offered through its services, it shall inform those recipients of the service that acquired such a product or contracted such a service during the last six months about the illegality, the identity of the trader and any means of redress.
Amendment 1547 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4) and at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union, and shall submit a report of that risk assessment to the national competent authority of the Member State in which their legal representative is established. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 1562 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for human dignity, private and family life, freedom of expression and information including the freedom and pluralism of the media, freedom of the art and science and the right to education, the prohibition of discrimination and the rights of the child, as enshrined in Articles 1, 7, 11, 13, 14, 21 and 24 of the Charter respectively;
Amendment 1570 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights, in particular the rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
Amendment 1708 #
Proposal for a regulation
Article 29 a (new)
Article 29 a (new)
Article 29a Recommendation systems and individual or target-group-specific pricing on online marketplaces The description shall also include information on whether users are shown different prices depending on individual factors, as defined in Article 6(1) ii) (ea) of Directive 2011/83/EU, or target-group-specific factors, in particular the devices used and geographical locations. Where applicable, the platform shall make reference to these factors in a clearly visible manner.
Amendment 1712 #
Proposal for a regulation
Article 30 – paragraph 1
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available and searchable, through easy-to-access, functional and reliable tools through application programming interfaces, a repository containing the information referred to in paragraph 2, until five years after the advertisement was displayed for the last time on their online interfaces. They shall ensure that multi-criterion queries can be performed per advertiser and per all data points present in the advertisement, and provide aggregated data for these queries on the amount spent, the target of the advertisement, and the audience the advertiser wishes to reach. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
Amendment 1737 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
Article 30 – paragraph 2 a (new)
2a. The repository must be easily accessible for users and contain a complaint and reporting option for users, directly addressed to the platform and the responsible advertising service provider. The requirements for notices under Article 14 also apply to notices and complaints about advertising content.
Amendment 1759 #
Proposal for a regulation
Article 31 – paragraph 3
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate, and with an easily accessible and user-friendly mechanism to search for multiple criteria, such as those reported in accordance with the obligations set out in Articles 13 and 23.
Amendment 1771 #
Proposal for a regulation
Article 31 – paragraph 5
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this Regulation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 1804 #
Proposal for a regulation
Article 33 a (new)
Article 33 a (new)
Article 33a Algorithm accountability 1. When using automated decision-making, the very large online platform shall perform an assessment of the algorithms used. 2. When carrying out the assessment referred to in paragraph 1, the very large online platform shall assess the following elements: (a) the compliance with corresponding Union requirements; (b) how the algorithm is used and its impact on the provision of the service; (c) the impact on fundamental rights, including on consumer rights, as well as the social effect of the algorithms; and (d) whether the measures implemented by the very large online platform to ensure the resilience of the algorithm are appropriate with regard to the importance of the algorithm for the provision of the service and its impact on the elements referred to in point (c). 3. When performing its assessment, the very large online platform may seek advice from relevant national public authorities, researchers and non-governmental organisations. 4. Following the assessment referred to in paragraph 2, the very large online platform shall communicate its findings to the Commission. The Commission shall be entitled to request additional explanation of the conclusions of the findings or, when the additional information on the findings provided is not sufficient, any relevant information on the algorithm in question in relation to points (a), (b), (c) and (d) of paragraph 2. The very large online platform shall communicate such additional information within a period of two weeks following the request of the Commission. 5. Where the very large online platform finds that the algorithm used does not comply with point (a) or (d) of paragraph 2 of this Article, the provider of the very large online platform shall take appropriate and adequate corrective measures to ensure that the algorithm complies with the criteria set out in paragraph 2. 6. Where the Commission finds that the algorithm used by the very large online platform does not comply with point (a), (c), or (d) of paragraph 2 of this Article, on the basis of the information provided by the very large online platform, and that the very large online platform has not undertaken corrective measures as referred to in paragraph 5 of this Article, the Commission shall recommend appropriate measures laid down in this Regulation to stop the infringement.
Amendment 1809 #
Proposal for a regulation
Article 33 a (new)
Article 33 a (new)
Article 33a Interoperability 1. Very large online platforms shall provide, by creating and offering an application programming interface, options enabling the interoperability of their core services to other online platforms. 2. Application programming interfaces should be easy to use, while the processing of personal data shall only be possible in a manner that ensures appropriate security of those data. Measures under paragraph 1 may not limit, hinder or delay the ability of content hosting platforms to fix security issues, nor should the need to fix security issues lead to an undue delay in the provision of interoperability. 3. This Article is without prejudice to any limitations and restrictions set out in Regulation (EU) 2016/679.
Amendment 1811 #
Proposal for a regulation
Article 34 – paragraph 1 – introductory part
Article 34 – paragraph 1 – introductory part
1. The Commission shall support and promote the development and implementation of voluntary industry standards set by relevant European and international standardisation bodies and, whenever available, widely-used information and communication technology standards that meet the requirements set out in Annex II of Regulation (EU) No 1025/2012, at least for the following:
Amendment 1822 #
Proposal for a regulation
Article 34 – paragraph 1 – point e
Article 34 – paragraph 1 – point e
(e) interoperability of the advertisement repositories referred to in Article 30(2), and the APIs referred to in Article 33a;
Amendment 1826 #
Proposal for a regulation
Article 34 – paragraph 1 – point f a (new)
Article 34 – paragraph 1 – point f a (new)
(fa) accessibility of elements and functions of online platforms and digital services for persons with disabilities, aiming at consistency and coherence with existing harmonised accessibility requirements when these elements and functions are not already covered by existing harmonised European standards;
Amendment 1842 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
Article 34 – paragraph 2 a (new)
2a. The absence of such standards as defined in this Article should not prevent the timely implementation of the measures outlined in this Regulation.
Amendment 1903 #
Proposal for a regulation
Article 37 – paragraph 4 – point f a (new)
Article 37 – paragraph 4 – point f a (new)
(fa) measures to ensure accessibility for persons with disabilities during the implementation of crisis protocols, including by providing an accessible description of these protocols;
Amendment 1914 #
Proposal for a regulation
Article 38 – paragraph 3 – subparagraph 2
Article 38 – paragraph 3 – subparagraph 2
Member States shall make publicly available through online and offline means, and communicate to the Commission and the Board, the name of their competent authority designated as Digital Services Coordinator and information on how it can be contacted.
Amendment 1919 #
Proposal for a regulation
Article 38 a (new)
Article 38 a (new)
Article 38a Relation to sector-specific provisions The application of these provisions does not affect areas that are subject to sector- specific regulation and provisions. In these areas, the responsibility for enforcing the provisions lies with the competent national authorities, which are organised in European networks. Within these networks, the competent authorities shall establish suitable procedures that allow for effective coordination and consistent application and enforcement of this Regulation.
Amendment 1941 #
Proposal for a regulation
Article 41 – paragraph 1 – introductory part
Article 41 – paragraph 1 – introductory part
1. Where needed for carrying out their tasks under this Regulation and also in order to avoid any discrepancy in the enforcement of the Digital Services Act, Digital Services Coordinators shall have at least the following powers of investigation, in respect of conduct by providers of intermediary services under the jurisdiction of their Member State:
Amendment 2050 #
Proposal for a regulation
Article 48 – paragraph 1
Article 48 – paragraph 1
1. The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator, notably representatives of European regulatory networks of independent national regulatory authorities, bodies or both, shall participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them.