Activities of Adriana MALDONADO LÓPEZ related to 2020/0361(COD)
Plenary speeches (1)
Digital Services Act (continuation of debate)
Amendments (122)
Amendment 75 #
Proposal for a regulation
Recital 4
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, clear, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated.
Amendment 78 #
Proposal for a regulation
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or on the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. _________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
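As an editorial illustration only (not part of the amendment), the recital's targeting factors can be read as a non-exhaustive checklist. A minimal Python sketch, in which every field name is an assumption introduced here:

```python
from dataclasses import dataclass

# Illustrative checklist of the targeting factors listed in Recital 8;
# all field names are hypothetical, introduced only for this sketch.
@dataclass
class TargetingIndicators:
    member_state_language: bool        # language generally used in that Member State
    member_state_currency: bool        # currency generally used in that Member State
    ordering_possible: bool            # possibility of ordering products or services
    national_top_level_domain: bool    # e.g. a national ccTLD
    national_app_store_listing: bool   # availability in the national application store
    local_advertising: bool            # local or local-language advertising
    local_language_customer_service: bool

def targeting_suggests_substantial_connection(ind: TargetingIndicators) -> bool:
    """Illustrative only: the recital calls for weighing *all* relevant
    circumstances, and mere technical accessibility of a website alone never
    establishes a substantial connection. This merely flags whether any
    listed targeting factor is present."""
    return any(vars(ind).values())

example = TargetingIndicators(True, False, True, False, False, False, False)
print(targeting_suggests_substantial_connection(example))  # True
```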
Amendment 85 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable Union or national law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 89 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, online marketplaces or search engines, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
Amendment 98 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical, automatic and passive processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider or where the provider of the service promotes and optimises the content.
Amendment 99 #
Proposal for a regulation
Recital 18 a (new)
(18 a) The exemptions from liability should also not be available to providers of intermediary services that do not comply with the due diligence obligations set out in this Regulation. The conditionality should further ensure that the standards to qualify for those exemptions contribute to a high level of safety and trust in the online environment in a manner that promotes a fair balance of the rights of all stakeholders.
Amendment 104 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in observance of the principles enshrined in the Charter of Fundamental Rights, including freedom of expression. Where the illegal content can cause significant public harm, the provider should assess and, when necessary, remove or disable access to that content within 24 hours and, in any case, not more than one hour after receiving a removal order from the competent authority. The provider can obtain such actual knowledge or awareness through, in particular, its periodic own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 107 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, unless they comply with a number of specific requirements set out in this Regulation, including the appointment of a legal representative in the Union, the implementation of notice and action mechanisms, the traceability of traders using their services, the provision of information on their online advertising and their recommender system practices and policy as well as transparency requirements towards the consumers as laid down in Directive 2011/83/EU. In addition, they should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
Amendment 113 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in a diligent manner and accompanied by additional safeguards. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union or national law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 116 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or neutral hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, content delivery networks or providers of services deeper in the internet stack, such as IT infrastructure services (on-premise, cloud-based or hybrid hosting solutions), that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service. Services deeper in the internet stack acting as online intermediaries could be required to take proportionate actions where the customer fails to remove the illegal content, unless technically impracticable.
Amendment 120 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as impeding upon the ability of providers to undertake proactive measures to identify and remove illegal content and to prevent its reappearance.
Amendment 128 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers, by professional entities and by users of services which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
Amendment 130 #
Proposal for a regulation
Recital 37
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. Providers of intermediary services that qualify as small or micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to establish collective representation under the guidance of the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative.
Amendment 137 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should prevent the reappearance of the notified illegal information. The provider should also inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 140 #
Proposal for a regulation
Recital 43
Amendment 144 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which must ensure human review and meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner and within a reasonable period of time. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 148 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content and are known to flag content frequently with a high rate of accuracy, that they represent collective interests and that they work in a diligent, objective and effective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry representing collective interests and of right-holders specifically created for that purpose could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions and ensure independent public interest representation. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council43. _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
Amendment 153 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 162 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned and by other intermediaries, such as advertising services, webhosting and domain name registration services, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. _________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
Amendment 168 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising that can have an impact both on the equal treatment and opportunities of citizens and on the perpetuation of harmful stereotypes and norms. Therefore, more transparency in online advertising markets is needed, and independent research needs to be carried out to assess the effectiveness of behavioural advertisements, which could pave the way for stricter measures or restriction of behavioural advertising. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 171 #
Proposal for a regulation
Recital 52 a (new)
(52 a) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
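Purely as an editorial illustration of the repository elements the amendment names (ad content, advertiser, delivery data, targeting), a hedged Python sketch; the schema and every field name are assumptions, not requirements taken from the text:

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical record for a public advertisement repository; field names
# are illustrative assumptions mirroring the elements named in the amendment.
@dataclass
class AdRepositoryEntry:
    ad_content: str                   # the content of the advertisement
    advertiser: str                   # on whose behalf the advertisement was displayed
    first_displayed: date             # delivery data
    last_displayed: date
    recipients_reached: int
    targeting_parameters: dict = field(default_factory=dict)  # if targeted advertising

entry = AdRepositoryEntry(
    ad_content="Example banner",
    advertiser="Example Co.",
    first_displayed=date(2021, 3, 1),
    last_displayed=date(2021, 3, 31),
    recipients_reached=120_000,
    targeting_parameters={"location": "ES", "age_range": "25-34"},
)
print(entry.advertiser)
```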
Amendment 186 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should present, clearly and separately, the main parameters for such recommender systems in a clear, concise, accessible and easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient, and shall not make the recipients of their services subject to recommender systems based on profiling by default.
Amendment 188 #
Proposal for a regulation
Recital 63
Amendment 213 #
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
(b a) promote innovation and facilitate competition for digital services, while protecting users’ and consumers’ rights.
Amendment 214 #
Proposal for a regulation
Article 1 – paragraph 2 – point b b (new)
(b b) promote a level playing field in the online ecosystem by introducing interoperability requirements for very large platforms.
Amendment 218 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(i a) Charter of Fundamental Rights of the European Union
Amendment 221 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – introductory part
(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of the provider of information society services which has a substantial connection to the Union; such a substantial connection is deemed to exist where the provider has an establishment in the Union or, in the absence of such an establishment, the assessment of a substantial connection is based on specific factual criteria, such as where the provider targets its activities towards one or more Member States.
Amendment 222 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
Amendment 223 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 2
Amendment 232 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation, and which governs itself under specific terms and conditions.
Amendment 240 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, rank and prioritise in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient, or otherwise determining the relative order or prominence of information displayed;
Amendment 246 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients of the service upon their request, on condition that the provider:
Amendment 247 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) the provider does not modify the information;
Amendment 248 #
Proposal for a regulation
Article 4 – paragraph 1 – point b
(b) the provider complies with conditions on access to the information;
Amendment 249 #
Proposal for a regulation
Article 4 – paragraph 1 – point c
(c) the provider complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry;
Amendment 250 #
Proposal for a regulation
Article 4 – paragraph 1 – point d
(d) the provider does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and
Amendment 250 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. Consequently, providers of services, such as cloud infrastructure, which are provided at the request of parties other than the content providers and only indirectly benefit the latter, should not be covered by the definition of online platforms. __________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
Amendment 252 #
Proposal for a regulation
Article 4 – paragraph 1 – point e
(e) the provider acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement.
Amendment 262 #
Proposal for a regulation
Article 5 – paragraph 4 a (new)
4 a. Providers of intermediary services shall be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 when they do not comply with the due diligence obligations set out in this Regulation.
Amendment 267 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
Providers of intermediary services shall ensure that voluntary investigations or activities are accompanied by appropriate safeguards, such as human oversight, to ensure they are transparent, fair and non-discriminatory.
Amendment 268 #
Proposal for a regulation
Article 7 – title
No general monitoring or active fact-finding obligations without undermining the obligation to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk
Amendment 272 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content, taking into account the potential harm the illegal content in question may create. In order to ensure a harmonised implementation of illegal content removal throughout the Union, the provider should, within 24 hours, remove or disable access to illegal content that can seriously harm public policy, public security or public health or seriously harm consumers’ health or safety. According to the well-established case-law of the Court of Justice and in line with Directive 2000/31/EC, the concept of ‘public policy’ involves a genuine, present and sufficiently serious threat which affects one of the fundamental interests of society, in particular for the prevention, investigation, detection and prosecution of criminal offences, including the protection of minors and the fight against any incitement to hatred on grounds of race, sex, religion or nationality, and violations of human dignity concerning individual persons. The concept of ‘public security’ as interpreted by the Court of Justice covers both the internal security of a Member State, which may be affected by, inter alia, a direct threat to the physical security of the population of the Member State concerned, and the external security, which may be affected by, inter alia, the risk of a serious disturbance to the foreign relations of that Member State or to the peaceful coexistence of nations. Where the illegal content does not seriously harm public policy, public security, public health or consumers’ health or safety, the provider should remove or disable access to illegal content within seven days. The deadlines referred to in this Regulation should be without prejudice to specific deadlines set out in Union law or in administrative or judicial orders. The provider may derogate from the deadlines referred to in this Regulation on the grounds of force majeure or for justifiable technical or operational reasons, but it should be required to inform the competent authorities as provided for in this Regulation. The removal or disabling of access should be undertaken in observance of the Charter of Fundamental Rights, including a high level of consumer protection and freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 284 #
Proposal for a regulation
Article 11 – paragraph 5 a (new)
5 a. Providers of intermediary services that qualify as small or micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to establish collective representation under the guidance of the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative.
Amendment 288 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on the activities undertaken by them in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
Amendment 291 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective, necessary and proportionate manner in applying and enforcing the activities referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 317 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 323 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify and assess the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 324 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
(a) where necessary, an explanation of the reasons why the individual or entity considers the information in question to be illegal content;
Amendment 330 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, such as the URL or URLs or, where necessary, additional information enabling the identification of the illegal content;
Amendment 332 #
Proposal for a regulation
Article 14 – paragraph 2 – point d
(d) a statement confirming the best knowledge of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete.
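For illustration only, the notice elements amended above (points (a), (b) and (d)) can be modelled as a simple record; a hedged Python sketch with hypothetical names:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical shape of a notice under the amended Article 14(2); only the
# points quoted above ((a), (b) and (d)) are modelled, with assumed names.
@dataclass
class IllegalContentNotice:
    electronic_location: str        # point (b): e.g. URL(s) or other identifying information
    best_knowledge_statement: bool  # point (d): accuracy and completeness statement
    reasons: Optional[str] = None   # point (a): required only "where necessary"

def is_sufficiently_substantiated(notice: IllegalContentNotice) -> bool:
    """Illustrative check of the minimum elements a diligent economic
    operator would need to identify and assess the content in question."""
    return bool(notice.electronic_location) and notice.best_knowledge_statement

print(is_sufficiently_substantiated(
    IllegalContentNotice("https://example.com/item/1", True)))  # True
```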
Amendment 345 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. Providers of hosting services shall ensure that content previously identified as illegal following the mechanisms in paragraphs 1 and 2 remains inaccessible after takedown.
Amendment 360 #
Proposal for a regulation
Recital 37
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. In addition, recipients of intermediary services should be able to hold the legal representative liable for non-compliance.
Amendment 379 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions whether or not to remove or disable access to the information;
Amendment 380 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
Amendment 381 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) decisions whether or not to suspend or terminate the recipients’ account.
Amendment 382 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints and include human review.
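A minimal sketch, assuming a simple queue-based workflow, of the human-review requirement this amendment adds to Article 17(2); every function and field name is invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Complaint:
    complaint_id: str
    contested_decision: str   # e.g. "removal", "service suspension", "account termination"
    substantiation: str       # the complainant's reasons

def request_human_review(complaint: Complaint, automated_suggestion: str) -> str:
    # Placeholder for routing the case to a human reviewer queue (assumed here).
    print(f"Queueing {complaint.complaint_id} for human review "
          f"(automated suggestion: {automated_suggestion})")
    return "pending_human_review"

def handle_complaint(complaint: Complaint, automated_suggestion: str) -> str:
    # Per the amendment, an automated assessment alone may not settle the
    # complaint: every outcome passes through human review.
    return request_human_review(complaint, automated_suggestion)

print(handle_complaint(Complaint("c-42", "removal", "content was lawful"), "uphold"))
```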
Amendment 401 #
Proposal for a regulation
Article 19 – paragraph 1 a (new)
1 a. In certain cases, such as those based on existing internal systems or depending on urgency, the trusted flagger regime should exceptionally allow other notices to be prioritised, in order to increase efficiency and the involvement of all actors.
Amendment 403 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has demonstrated particular competence, accuracy and expertise for the purposes of detecting, identifying and notifying illegal content;
Amendment 407 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests, ensures independent public interest representation and is independent from any online platform;
Amendment 409 #
Proposal for a regulation
Article 19 – paragraph 2 – point c
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner.
Amendment 410 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. Digital Services Coordinators shall engage in dialogue with platforms and rights holders for maintaining the accuracy and efficacy of a trusted flagger system.
Amendment 418 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide illegal content. A termination of the service can be issued in case the recipients fail to comply with the applicable provisions set out in this Regulation or in case the suspension has occurred at least three times following verification of the repeated provision of illegal content.
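Read literally, the amended paragraph implies a warn-suspend-terminate sequence. A hedged Python sketch of that sequence; apart from the "at least three" suspensions figure taken from the text, everything here is an assumption:

```python
# Illustrative state machine for the amended Article 20(1): a prior warning
# comes first, repeated provision of illegal content leads to suspension,
# and termination becomes possible after at least three suspensions.
class RecipientMisuseTracker:
    def __init__(self) -> None:
        self.warned = False
        self.suspension_count = 0

    def record_verified_illegal_content(self) -> str:
        if not self.warned:
            self.warned = True            # prior warning before any suspension
            return "warning_issued"
        self.suspension_count += 1
        if self.suspension_count >= 3:    # threshold taken from the amendment
            return "termination_possible"
        return "suspended_for_reasonable_period"

tracker = RecipientMisuseTracker()
print([tracker.record_verified_illegal_content() for _ in range(4)])
# ['warning_issued', 'suspended_for_reasonable_period',
#  'suspended_for_reasonable_period', 'termination_possible']
```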
Amendment 426 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
Amendment 429 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year;
Amendment 432 #
Proposal for a regulation
Article 20 – paragraph 3 – point d
Amendment 443 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders, be it business-to-consumer or peer-to-peer, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information:
Amendment 451 #
Proposal for a regulation
Article 22 – paragraph 1 – point f
Amendment 479 #
Proposal for a regulation
Article 23 – paragraph 1 – point b
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
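As a sketch only, the three reporting categories the amendment distinguishes could be tallied separately; the category keys below are illustrative assumptions:

```python
from collections import Counter

# Hypothetical tally of suspensions per the amended Article 23(1)(b).
suspensions = Counter(illegal_content=0, unfounded_notices=0, unfounded_complaints=0)

def record_suspension(ground: str) -> None:
    # Restrict tallies to the three categories the report must distinguish.
    assert ground in suspensions, "unknown reporting category"
    suspensions[ground] += 1

record_suspension("illegal_content")
record_suspension("unfounded_notices")
print(dict(suspensions))
# {'illegal_content': 1, 'unfounded_notices': 1, 'unfounded_complaints': 0}
```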
Amendment 487 #
Proposal for a regulation
Article 24 – paragraph 1 – point a
(a) that the information displayed, or parts thereof, is an online advertisement;
Amendment 489 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
Amendment 494 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) clear and meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed.
Amendment 496 #
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
(c a) whether the advertisement was selected using an automated system and, in that case, the identity of the natural or legal person responsible for the system.
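Taken together, points (a) to (c a) describe a per-advertisement disclosure. A hedged Python sketch of such a record; all field names are assumptions introduced here:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical per-advertisement disclosure covering the amended points
# (a), (b), (c) and (c a) of Article 24(1); names are illustrative only.
@dataclass
class AdDisclosure:
    is_advertisement: bool                  # (a) the information, or parts of it, is an ad
    on_behalf_of: str                       # (b) on whose behalf it is displayed
    financed_by: str                        # (b) who finances it
    main_targeting_parameters: dict         # (c) why this recipient sees it
    selected_by_automated_system: bool      # (c a) automated selection?
    system_operator: Optional[str] = None   # (c a) who is responsible for that system

disclosure = AdDisclosure(
    is_advertisement=True,
    on_behalf_of="Example Brand",
    financed_by="Example Brand Holding",
    main_targeting_parameters={"location": "ES", "interest": "cycling"},
    selected_by_automated_system=True,
    system_operator="Example Ad Exchange",
)
print(disclosure.on_behalf_of)
```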
Amendment 499 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Providers of intermediary services shall inform the natural or legal person on whose behalf the advertisement is displayed where the advertisement has been displayed. They shall also inform public authorities, non-governmental organisations and researchers, upon their request.
Amendment 502 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
Online platforms shall favour advertising that does not require any tracking of user interaction with content.
Amendment 503 #
Proposal for a regulation
Article 24 – paragraph 1 c (new)
Online platforms shall offer the possibility to easily opt out of micro-targeted tracking.
Amendment 504 #
Proposal for a regulation
Article 24 – paragraph 1 d (new)
Online platforms shall offer the possibility to opt in to the use of behavioural data and political advertising.
Amendment 507 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3, or with a turnover of over EUR 50 million per year.1a _________________ 1a Commission Staff Working Document. Impact Assessment Report. Annexes. (SWD(2020)348).
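The amended scope rule reduces to a two-pronged threshold test. A minimal Python sketch; the counting methodology itself is left to delegated acts, so this is illustrative only:

```python
# Sketch of the amended Article 25(1) scope test: 45 million average monthly
# active recipients in the Union, or annual turnover above EUR 50 million.
def is_very_large_online_platform(avg_monthly_active_recipients_eu: int,
                                  annual_turnover_eur: float) -> bool:
    return (avg_monthly_active_recipients_eu >= 45_000_000
            or annual_turnover_eur > 50_000_000)

print(is_very_large_online_platform(10_000_000, 60_000_000.0))  # True, via turnover
print(is_very_large_online_platform(50_000_000, 1_000_000.0))   # True, via recipients
```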
Amendment 514 #
Proposal for a regulation
Article 25 – paragraph 4 a (new)
4 a. Very large platforms shall allow business users and providers of ancillary services access to and interoperability with the same operating system, hardware or software features that are available or used in the provision by the gatekeeper of any ancillary services.
Amendment 515 #
Proposal for a regulation
Article 25 – paragraph 4 b (new)
4 b. Gatekeepers of very large platforms shall allow the installation and effective use of third party software applications or software application stores using, or interoperating with, operating systems of that gatekeeper and allow these software applications or software application stores to be accessed by means other than the core platform services of that gatekeeper. The gatekeeper shall not be prevented from taking proportionate measures to ensure that third party software applications or software application stores do not endanger the integrity of the hardware or operating system provided by the gatekeeper.
Amendment 516 #
Proposal for a regulation
Article 25 – paragraph 4 c (new)
4 c. Very large platforms shall refrain from technically restricting the ability of end users to switch between and subscribe to different software applications and services to be accessed using the operating system of the gatekeeper, including as regards the choice of Internet access provider for end users.
Amendment 517 #
Proposal for a regulation
Article 25 – paragraph 4 d (new)
4 d. Very large platforms shall allow consumers and developers in mobile application ecosystems to increase the number of applications available and ensure new functionalities across software applications and services to be accessed using the operating systems of the gatekeeper.
Amendment 528 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and activities and shall include the following systemic risks:
Amendment 535 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights, including the rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
Amendment 552 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective measures to cease, prevent and mitigate the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
Amendment 554 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision- making processes, the features or functioning of their services and activities, or their terms and conditions;
Amendment 564 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
(b) best practices for very large online platforms to cease, prevent and mitigate the systemic risks identified.
Amendment 593 #
Proposal for a regulation
Article 29 – paragraph -1 (new)
-1. Online platforms that use recommender systems shall indicate visibly to their recipients that the platform uses recommender systems.
Amendment 594 #
Proposal for a regulation
Article 29 – paragraph -1 a (new)
-1 a. Online platforms shall ensure that the option activated by default for the recipient of the service is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679.
Amendment 598 #
Proposal for a regulation
Article 29 – paragraph 1
1. Online platforms that use recommender systems shall set out separately the information concerning the role and functioning of recommender systems in a manner that is clear for average users, concise, accessible and easily comprehensible, and shall offer controls with the available options for the recipients of the service to modify, customize or influence those main parameters in a user-friendly manner, including at least one option which is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679, as well as basic natural criteria such as time, topics of interest, etc.
Amendment 601 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
Article 29 – paragraph 1 a (new)
1 a. The parameters referred to in paragraph 1 shall include, at a minimum: (a) whether the recommender system is an automated system and, in that case, the identity of the natural or legal person responsible for the recommender system, if different from the platform provider; (b) clear information about the criteria used by recommender systems; (c) the relevance and weight of each criterion which leads to the information recommended; (d) what goals the relevant system has been optimised for; (e) if applicable, an explanation of the role that the behaviour of the recipients of the service plays in how the relevant system produces its outputs.
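As a rough illustration only, the minimum parameters in points (a) to (e) could be captured in a machine-readable disclosure along the lines of the sketch below; all field names and example values are invented, not prescribed by the amendment.

```python
# Hypothetical machine-readable disclosure covering points (a)-(e)
# of the proposed Article 29(1a). Field names are illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RecommenderDisclosure:
    is_automated: bool                 # point (a)
    responsible_entity: Optional[str]  # point (a), if different from the provider
    criteria: list                     # point (b): criteria used by the system
    criteria_weights: dict             # point (c): relevance/weight per criterion
    optimisation_goals: list           # point (d): what the system is optimised for
    role_of_user_behaviour: str = ""   # point (e), where applicable

disclosure = RecommenderDisclosure(
    is_automated=True,
    responsible_entity=None,  # same as the platform provider
    criteria=["recency", "topic match", "past engagement"],
    criteria_weights={"recency": 0.5, "topic match": 0.3, "past engagement": 0.2},
    optimisation_goals=["time spent on service"],
    role_of_user_behaviour="clicks and watch time feed the 'past engagement' criterion",
)
print(disclosure)
```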
Amendment 606 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
Article 1 – paragraph 2 – point b
(b) set out harmonised rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter, including a high level of consumer protection, are effectively protected.
Amendment 618 #
Proposal for a regulation
Article 30 – paragraph 2 – point c a (new)
Article 30 – paragraph 2 – point c a (new)
(c a) data regarding the amount of spending;
Amendment 619 #
Proposal for a regulation
Article 30 – paragraph 2 – point d a (new)
Article 30 – paragraph 2 – point d a (new)
(d a) whether one or more particular groups of recipients of the service have been explicitly excluded from the advertisement target group;
Amendment 627 #
Proposal for a regulation
Article 31 – paragraph 2
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide information and access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of facilitating and conducting public interest research that contributes to the identification and understanding of systemic risks as set out in Article 26(1) and to enable verification of the effectiveness and proportionality of the mitigation measures as set out in Article 27(1).
Amendment 631 #
Proposal for a regulation
Article 31 – paragraph 3 a (new)
Article 31 – paragraph 3 a (new)
3 a. Very large online platforms shall provide effective portability of data generated through the activity of a business user or end user and shall, in particular, provide tools for end users to facilitate the exercise of data portability, in line with Regulation (EU) 2016/679, including by the provision of continuous and real-time access.
Amendment 632 #
Proposal for a regulation
Article 31 – paragraph 3 b (new)
Article 31 – paragraph 3 b (new)
3 b. Very large online platforms shall provide business users, or third parties authorised by a business user, free of charge, with effective, high-quality, continuous and real-time access to and use of aggregated or non-personal aggregated data that is provided for or generated in the context of the use of the relevant core platform services by those business users and the end users engaging with the products or services provided by those business users. For personal data, they shall provide access and use, in full compliance with Regulation (EU) 2016/679, only where the data are directly connected with the use effectuated by the end user in respect of the products or services offered by the relevant business user through the relevant core platform service, and when the end user opts in to such sharing with consent within the meaning of Regulation (EU) 2016/679. The functionalities for giving information and offering the opportunity to grant consent shall be as user-friendly as possible.
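Paragraphs 3a and 3b together suggest a two-tier access model: aggregated or non-personal data flows to business users by default, while personal data requires an end-user opt-in consent under Regulation (EU) 2016/679. A minimal Python sketch of that gating logic follows; the data records and consent registry are invented for illustration.

```python
# Minimal sketch of the access gating implied by paragraphs 3a/3b.
# Data model and consent registry are hypothetical.

AGGREGATED = [{"product": "X", "views_last_24h": 1042}]  # non-personal, always shareable
PERSONAL = [
    {"end_user": "u1", "product": "X", "purchase": True},
    {"end_user": "u2", "product": "X", "purchase": False},
]
# End users who gave GDPR-style consent to share data with the business user.
CONSENTED_USERS = {"u1"}

def export_for_business_user():
    # Aggregated/non-personal data: free of charge, continuous access.
    out = {"aggregated": AGGREGATED}
    # Personal data: only for end users who opted in, and only records
    # connected to their use of the business user's products or services.
    out["personal"] = [r for r in PERSONAL if r["end_user"] in CONSENTED_USERS]
    return out

print(export_for_business_user())
```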
Amendment 633 #
Proposal for a regulation
Article 31 – paragraph 3 c (new)
Article 31 – paragraph 3 c (new)
3 c. The data provided to vetted researchers shall be as disaggregated as possible, unless the researcher requests otherwise.
Amendment 634 #
Proposal for a regulation
Article 31 – paragraph 4
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions, civil society organisations or think tanks representing the public interest, be independent from commercial interests, disclose the funding financing the research, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
Amendment 640 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
Article 1 – paragraph 5 – point i a (new)
(ia) Directive (EU) 2019/882
Amendment 681 #
Proposal for a regulation
Article 42 – paragraph 3
Article 42 – paragraph 3
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6% of the annual income or global turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, for failure to reply to or rectify incorrect, incomplete or misleading information, and for failure to submit to an on-site inspection shall not exceed 1% of the annual income or global turnover of the provider concerned.
Amendment 684 #
Proposal for a regulation
Article 42 – paragraph 4
Article 42 – paragraph 4
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5% of the average daily global turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned.
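To make the ceilings in Article 42(3) and (4) concrete (the same percentages recur in the Commission's own fining powers under Articles 59 and 60 below), here is a worked example with invented figures:

```python
# Worked example of the Article 42 penalty ceilings (figures invented).
annual_turnover = 10_000_000_000           # EUR, preceding financial year
avg_daily_turnover = annual_turnover / 365

max_fine = 0.06 * annual_turnover          # Art. 42(3): 6% cap -> EUR 600m
max_info_fine = 0.01 * annual_turnover     # Art. 42(3): 1% cap for incorrect,
                                           # incomplete or misleading info -> EUR 100m
days_of_non_compliance = 30
max_periodic = 0.05 * avg_daily_turnover * days_of_non_compliance
                                           # Art. 42(4): 5% of average daily
                                           # turnover per day -> ~EUR 41.1m over 30 days

print(f"{max_fine:,.0f} | {max_info_fine:,.0f} | {max_periodic:,.0f}")
```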
Amendment 704 #
Proposal for a regulation
Article 59 – paragraph 1 – introductory part
Article 59 – paragraph 1 – introductory part
1. In the decision pursuant to Article 58, the Commission may impose on the very large online platform concerned fines not exceeding 6% of its total global turnover in the preceding financial year where it finds that the platform, intentionally or negligently:
Amendment 705 #
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
Article 59 – paragraph 2 – introductory part
2. The Commission may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total global turnover in the preceding financial year, where they intentionally or negligently:
Amendment 706 #
Proposal for a regulation
Article 60 – paragraph 1 – introductory part
Article 60 – paragraph 1 – introductory part
1. The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily global turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to:
Amendment 765 #
Proposal for a regulation
Article 5 – paragraph 2
Article 5 – paragraph 2
2. Paragraph 1 shall not apply where the recipient of the service is acting under the authority, decisive influence or the control of the provider.
Amendment 777 #
Proposal for a regulation
Article 5 a (new)
Article 5 a (new)
Amendment 928 #
Proposal for a regulation
Article 12 – paragraph 1
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, and shall use fair, non-discriminatory and transparent contract terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be drafted in clear and unambiguous language and shall be publicly available in an easily accessible format in a searchable archive of all the previous versions with their date of application.
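The requirement to keep "a searchable archive of all the previous versions with their date of application" is, in engineering terms, a versioning requirement. A toy Python sketch of such an archive follows; the structure and sample texts are invented for illustration.

```python
# Toy sketch of a searchable terms-and-conditions version archive,
# as required by the amended Article 12(1). Structure is illustrative.
from datetime import date

ARCHIVE = [
    {"applies_from": date(2019, 1, 1), "text": "v1: moderation by human review only."},
    {"applies_from": date(2021, 6, 1), "text": "v2: adds algorithmic decision-making."},
]

def version_in_force(on_date):
    """Return the version applicable on a given date."""
    candidates = [v for v in ARCHIVE if v["applies_from"] <= on_date]
    return max(candidates, key=lambda v: v["applies_from"]) if candidates else None

def search(term):
    """Full-text search across all archived versions."""
    return [v for v in ARCHIVE if term.lower() in v["text"].lower()]

print(version_in_force(date(2020, 3, 1))["text"])                     # v1 ...
print([v["applies_from"].isoformat() for v in search("algorithmic")])  # ['2021-06-01']
```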
Amendment 1132 #
Proposal for a regulation
Article 15 a (new)
Article 15 a (new)
Amendment 1145 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, and individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, with access to an effective internal complaint-handling system which enables complaints to be lodged electronically and free of charge against the decision taken by the provider of the online platform not to act upon receipt of a notice, or against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 1152 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
Article 17 – paragraph 1 – point a
(a) decisions whether or not to remove or disable access to or restrict visibility of the information;
Amendment 1159 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
Article 17 – paragraph 1 – point b
(b) decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
Amendment 1163 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
Article 17 – paragraph 1 – point c
(c) decisions whether or not to suspend or terminate the recipients’ account.
Amendment 1200 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall always direct recipients to an out-of-court dispute settlement body. The information about the competent out-of-court body shall be easily accessible on the online interface of the online platform in a clear and user-friendly manner.
Amendment 1205 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 2
Article 18 – paragraph 1 – subparagraph 2
Amendment 1208 #
Proposal for a regulation
Article 18 – paragraph 1 a (new)
Article 18 – paragraph 1 a (new)
1a. Online platforms shall engage, in good faith, with the independent, external certified body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 1243 #
Proposal for a regulation
Article 18 – paragraph 2 a (new)
Article 18 – paragraph 2 a (new)
2a. Certified out-of-court dispute settlement bodies shall draw up annual reports listing the number of complaints received annually, the outcomes of the decisions delivered, any systematic or sectoral problems identified, and the average time taken to resolve the disputes.
Amendment 1362 #
Proposal for a regulation
Article 21 – paragraph 2 a (new)
Article 21 – paragraph 2 a (new)
2a. When a platform that allows consumers to conclude distance contracts with traders becomes aware that a piece of information, a product or service poses a serious risk to the life, health or safety of consumers, it shall promptly inform the competent authorities of the Member State or Member States concerned and provide all relevant information available.
Amendment 1712 #
Proposal for a regulation
Article 30 – paragraph 1
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available and searchable, through easy-to-access, functional and reliable tools and through application programming interfaces, a repository containing the information referred to in paragraph 2, until five years after the advertisement was displayed for the last time on their online interfaces. They shall ensure that multi-criterion queries can be performed per advertiser and per all data points present in the advertisement, and provide aggregated data for these queries on the amount spent, the target of the advertisement, and the audience the advertiser wishes to reach. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
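The amended Article 30(1) effectively mandates an ad repository supporting multi-criterion queries with aggregated spend and targeting data. The in-memory sketch below only illustrates what such queries could look like; the records and field names are hypothetical.

```python
# Hypothetical sketch of the multi-criterion ad-repository queries
# required by the amended Article 30(1). All records are invented;
# no personal data of recipients is stored.
ADS = [
    {"advertiser": "ACME", "topic": "shoes", "language": "es",
     "spend_eur": 1200, "target_audience": "18-35, interested in sport"},
    {"advertiser": "ACME", "topic": "shoes", "language": "fr",
     "spend_eur": 800, "target_audience": "18-35, interested in sport"},
    {"advertiser": "Globex", "topic": "travel", "language": "es",
     "spend_eur": 500, "target_audience": "all adults"},
]

def query(**criteria):
    """Multi-criterion query over all data points present in the ads."""
    return [ad for ad in ADS
            if all(ad.get(k) == v for k, v in criteria.items())]

def aggregate(results):
    """Aggregated data for a query: total spend and targeting info."""
    return {
        "total_spend_eur": sum(ad["spend_eur"] for ad in results),
        "audiences": sorted({ad["target_audience"] for ad in results}),
    }

hits = query(advertiser="ACME", topic="shoes")
print(aggregate(hits))  # {'total_spend_eur': 2000, 'audiences': [...]}
```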
Amendment 1759 #
Proposal for a regulation
Article 31 – paragraph 3
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate, and with an easily accessible and user-friendly mechanism to search for multiple criteria, such as those reported in accordance with the obligations set out in Articles 13 and 23.
Amendment 1771 #
Proposal for a regulation
Article 31 – paragraph 5
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this legislation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 1804 #
Proposal for a regulation
Article 33 a (new)
Article 33 a (new)
Article 33a
Algorithm accountability
1. When using automated decision-making, the very large online platform shall perform an assessment of the algorithms used.
2. When carrying out the assessment referred to in paragraph 1, the very large online platform shall assess the following elements: (a) the compliance with corresponding Union requirements; (b) how the algorithm is used and its impact on the provision of the service; (c) the impact on fundamental rights, including on consumer rights, as well as the social effect of the algorithms; and (d) whether the measures implemented by the very large online platform to ensure the resilience of the algorithm are appropriate with regard to the importance of the algorithm for the provision of the service and its impact on the elements referred to in point (c).
3. When performing its assessment, the very large online platform may seek advice from relevant national public authorities, researchers and non-governmental organisations.
4. Following the assessment referred to in paragraph 2, the very large online platform shall communicate its findings to the Commission. The Commission shall be entitled to request additional explanation of the conclusions of the findings or, where the additional information provided on the findings is not sufficient, any relevant information on the algorithm in question in relation to points (a), (b), (c) and (d) of paragraph 2. The very large online platform shall communicate such additional information within a period of two weeks following the request of the Commission.
5. Where the very large online platform finds that the algorithm used does not comply with point (a) or (d) of paragraph 2 of this Article, the provider of the very large online platform shall take appropriate and adequate corrective measures to ensure that the algorithm complies with the criteria set out in paragraph 2.
6. Where the Commission finds, on the basis of the information provided by the very large online platform, that the algorithm used by the very large online platform does not comply with point (a), (c) or (d) of paragraph 2 of this Article, and that the very large online platform has not undertaken the corrective measures referred to in paragraph 5 of this Article, the Commission shall recommend appropriate measures laid down in this Regulation to stop the infringement.
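The assessment in paragraph 2 lends itself to a simple checklist structure. The Python sketch below mirrors points (a) to (d) with invented field names; it is an illustration, not a prescribed format.

```python
# Illustrative checklist for the Article 33a (new) algorithm assessment.
# Field names mirror points (a)-(d) of paragraph 2 and are invented.
from dataclasses import dataclass

@dataclass
class AlgorithmAssessment:
    complies_with_union_law: bool       # point (a)
    use_and_service_impact: str         # point (b)
    fundamental_rights_impact: str      # point (c)
    resilience_measures_adequate: bool  # point (d)

    def corrective_action_needed(self) -> bool:
        # Paragraph 5: corrective measures where (a) or (d) is not met.
        return not (self.complies_with_union_law
                    and self.resilience_measures_adequate)

assessment = AlgorithmAssessment(
    complies_with_union_law=True,
    use_and_service_impact="ranks items in the main feed",
    fundamental_rights_impact="risk of discriminatory demotion; mitigated by audits",
    resilience_measures_adequate=False,
)
print(assessment.corrective_action_needed())  # True -> paragraph 5 applies
```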
Amendment 1842 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
Article 34 – paragraph 2 a (new)
2a. The absence of such standards as defined in this Article should not prevent the timely implementation of the measures outlined in this Regulation.