
97 Amendments of Eva MAYDELL related to 2020/0361(COD)

Amendment 81 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. The Commission and the Member States should provide guidance on how to identify illegal content.
2021/06/23
Committee: ITRE
Amendment 121 #
Proposal for a regulation
Recital 30
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The orders to act against illegal content may require providers of intermediary services to take steps, in the specific case, to remove identical or equivalent illegal content, within the same context. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non- disclosure of information.
2021/06/23
Committee: ITRE
Amendment 140 #
Proposal for a regulation
Recital 4
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation and the competitiveness of European companies should not be hampered but instead be stimulated.
2021/09/10
Committee: ECON
Amendment 142 #
Proposal for a regulation
Recital 43 a (new)
(43 a) To similarly avoid unnecessary regulatory burden, certain obligations should not apply to online platforms offering products and services from third-party traders, which are established in the European Union, where these traders' access is exclusive, curated and entirely controlled by the providers of the online platform and these traders’ products and services are reviewed and pre-approved by the providers of the online platform before they are offered on the platform. These online platforms are often referred to as closed online platforms. As the products and services offered are reviewed and pre-approved by the online platforms, the prevalence of illegal content and products on these platforms is low, and these platforms cannot benefit from the relevant liability exemptions outlined in this Regulation. These online platforms should consequently not be subject to the obligations which are necessary for platforms with different operational models, where illegal content is more prevalent and the relevant liability exemptions are available.
2021/06/23
Committee: ITRE
Amendment 151 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms, having received guidance from public authorities on how to identify illegal content, take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/06/23
Committee: ITRE
Amendment 157 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders on the platforms should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
2021/06/23
Committee: ITRE
Amendment 161 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot, as this would be disproportionate. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties or be liable for this information in case it proves to be inaccurate. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48.
_________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
2021/06/23
Committee: ITRE
Amendment 161 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly, in connection with information relating to illegal content, products, services and activities. The illegal nature of such content, products or services is defined by relevant Union law or national law in accordance with Union law. The concept should be understood, for example, to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/09/10
Committee: ECON
Amendment 174 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation, without prejudice to Article 6, in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
2021/09/10
Committee: ECON
Amendment 175 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. In certain cases, online platforms whose number of recipients does not exceed the operational threshold set at 10% of the Union population should also be considered very large online platforms due to their role in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online.
2021/06/23
Committee: ITRE
Amendment 182 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content, or against content that violates the community rules and guidelines of the intermediary services, that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
2021/09/10
Committee: ECON
Amendment 185 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation, without prejudice to its freedom to conduct a business and, in particular, its ability to design and implement effective measures that are aligned with its specific business model. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. A disclaimer of an opinion should be given where the auditor does not have enough information to conclude on an opinion due to the novelty of the issues audited.
2021/06/23
Committee: ITRE
Amendment 186 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services, cloud services or search engines. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, where they qualify as ‘mere conduit’, ‘caching’ or hosting services.
2021/09/10
Committee: ECON
Amendment 193 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity.
2021/09/10
Committee: ECON
Amendment 216 #
Proposal for a regulation
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. In this regard, the Commission and Digital Services Coordinators may work together on information and guidelines for the voluntary implementation of the provisions in this Regulation for micro or small enterprises. Furthermore, the Commission and Digital Services Coordinators are also encouraged to do so for medium enterprises, which, while not benefitting from the liability exemptions in Section 3, may sometimes lack the legal resources necessary to ensure proper understanding and compliance with all provisions. _________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/09/10
Committee: ECON
Amendment 220 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. Such entities can also include businesses that have a vested interest in flagging counterfeit products of their brand, thus ensuring that the online consumer experience is safer and more reliable. Similarly, for intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/09/10
Committee: ECON
Amendment 225 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of content manifestly related to a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the relevant competent authorities, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/09/10
Committee: ECON
Amendment 230 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders on the platforms should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
2021/09/10
Committee: ECON
Amendment 232 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of some of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. The online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties or be liable for this information in case it proves to be inaccurate. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48.
_________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
2021/09/10
Committee: ECON
Amendment 240 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by legislative acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. Provisions should also exist for Member States to request that the Commission assess whether an online platform that does not meet the threshold of 45 million active monthly users may still cause significant and systemic societal risks. While an online platform may not meet the quantitative criteria to be designated as a very large online platform, it may meet qualitative criteria. In such cases, the Digital Services Coordinator of establishment may require the online platform to fulfil part of the obligations set out in Section 4 for a limited period of time until the risk has abated.
2021/09/10
Committee: ECON
Amendment 253 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Under such mitigating measures, very large online platforms should consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. Such reinforcement could include the expansion of, and allocation of resources to, content moderation in languages other than English. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/09/10
Committee: ECON
Amendment 258 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation, without prejudice to its freedom to conduct a business and, in particular, its ability to design and implement effective measures that are aligned with its specific business model. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform systematically does not comply with this Regulation or the commitments undertaken. A disclaimer of an opinion should be given where the auditor does not have enough information to conclude on an opinion due to the novelty of the issues audited.
2021/09/10
Committee: ECON
Amendment 263 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms pose particular risks depending on the category of the advertisement and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements related to public health, public security, civil discourse, political participation and equality. The repositories of the advertisements related to these categories should be displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements in these specific categories and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
2021/09/10
Committee: ECON
Amendment 301 #
Proposal for a regulation
Recital 100
(100) Compliance with the relevant obligations imposed under this Regulation should be enforceable by means of fines and periodic penalty payments. To that end, appropriate levels of fines and periodic penalty payments should also be laid down for systemic non-compliance with the relevant obligations and breach of the procedural rules, subject to appropriate limitation periods. A systematic infringement is a pattern of online harm that, when the individual harms are added up, constitutes an aggregation of systemic harm to active recipients of the service across three or more EU Member States.
2021/09/10
Committee: ECON
Amendment 319 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user- friendly, and allow for the submission of notices at scale and exclusively by electronic means.
2021/06/24
Committee: ITRE
Amendment 327 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, through the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
2021/09/10
Committee: ECON
Amendment 328 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature or functionality of another service or the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation.
2021/09/10
Committee: ECON
Amendment 329 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear identification of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
2021/06/24
Committee: ITRE
Amendment 344 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. Where a provider of hosting services processes a notice and decides to remove or disable access to specific items of information provided by the recipients of the service, it shall take steps, in the specific case, to remove identical or equivalent illegal content, within the same context.
2021/06/24
Committee: ITRE
Amendment 346 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders. It is important that hosting services adopt the highest standards of transparency to highlight, in a way that would lead an average and reasonably well-informed consumer to understand, that the information, or the product or service that is the object of the transaction, comes from a third party which is not offered by the hosting service.
2021/09/10
Committee: ECON
Amendment 347 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures for the implementation of community rules and guidelines of their services, or to comply with the requirements of Union law, including those set out in this Regulation, or national law in accordance with Union law.
2021/09/10
Committee: ECON
Amendment 354 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 a (new)
- the order is transmitted via secure channels established between the relevant national judicial or administrative authorities and the providers of intermediary services;
2021/09/10
Committee: ECON
Amendment 356 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 1 (new)
In extraordinary cases, where the intermediary service has reasonable doubt that the removal order is not legally sound, the intermediary service should have access to a mechanism to challenge the decision. This mechanism shall be established by the Digital Services Coordinators in coordination with the Board and the Commission.
2021/09/10
Committee: ECON
Amendment 358 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
- precise identification elements of the recipients of the service concerned;
2021/09/10
Committee: ECON
Amendment 359 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2 a (new)
- the order is transmitted via secure channels established between the relevant national judicial or administrative authorities and the providers of intermediary services.
2021/09/10
Committee: ECON
Amendment 360 #
Proposal for a regulation
Article 15 – paragraph 4 b (new)
4 b. Providers of hosting services shall not be obliged to provide a statement of reasons referred to in paragraph 1 where doing so would infringe a legal obligation or where the statement of reasons could cause unintended safety concerns for the reporting party. In addition, providers of hosting services shall not be obliged to provide a statement of reasons referred to in paragraph 1 where the provider can demonstrate that the recipient of the service has repeatedly provided illegal content.
2021/06/24
Committee: ITRE
Amendment 361 #
Proposal for a regulation
Article 10 – paragraph 2
2. Providers of intermediary services shall make public to trusted flaggers as well as users in all Member States the information necessary to easily identify and communicate with their intermediary services' single points of contact.
2021/09/10
Committee: ECON
Amendment 362 #
Proposal for a regulation
Article 15 a (new)
Article 15 a Protection against repeated misuse and criminal offences 1. Providers of intermediary services shall, after having issued a prior warning, suspend or in appropriate circumstances terminate the provision of their services to recipients of the service that frequently provide illegal content. 2. Where a provider of intermediary service becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available. Where the provider of intermediary service cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it has its main establishment or has its legal representative and also transmit this information to Europol for appropriate follow-up.
2021/06/24
Committee: ITRE
Amendment 373 #
Proposal for a regulation
Article 16 – paragraph 1 b (new)
This Section shall not apply to online platforms offering products and services from third-party traders, which are established in the European Union, where these traders' access is exclusive, curated and entirely controlled by the providers of the online platform and these traders’ products and services are reviewed and pre-approved by the providers of the online platform before they are offered on the platform.
2021/06/24
Committee: ITRE
Amendment 378 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) the content moderation engaged in through the providers’ voluntary own-initiative investigations as per Article 6, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures;
2021/09/10
Committee: ECON
Amendment 394 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) where possible, a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
2021/09/10
Committee: ECON
Amendment 395 #
Proposal for a regulation
Article 14 – paragraph 2 – point c
(c) where possible, the name and an electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;
2021/09/10
Committee: ECON
Amendment 399 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 solely in respect of the specific item of information concerned, when the provider of hosting services can unequivocally identify the illegal nature of the content.
2021/09/10
Committee: ECON
Amendment 409 #
Proposal for a regulation
Article 15 – paragraph 2 – subparagraph 1 (new)
Where a provider of hosting services decides not to remove or disable access to specific items of information provided by the recipients of the service, detected through the mechanisms established in Article 14, it shall inform the user who notified the online platform of the content and, where needed, the recipient of the decision without undue delay. The notification of such a decision may be done through automated means.
2021/09/10
Committee: ECON
Amendment 411 #
Proposal for a regulation
Article 15 – paragraph 4
4. Providers of hosting services shall publish the decisions and the statements of reasons, referred to in paragraph 1 in a publicly accessible database managed by the Commission. That information shall not contain personal data.deleted
2021/09/10
Committee: ECON
Amendment 415 #
Proposal for a regulation
Article 16 – paragraph 1 – subparagraph 1 (new)
The Commission and Digital Service Coordinators may work together on information and guidelines for the voluntary implementation of the provisions in this Regulation for micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2021/09/10
Committee: ECON
Amendment 416 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide to all recipients of the service, for a period of at least six months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge. Complaints can be filed against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
2021/09/10
Committee: ECON
Amendment 420 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable or restrict access to the information;
2021/09/10
Committee: ECON
Amendment 421 #
Proposal for a regulation
Article 17 – paragraph 1 – subparagraph 1 (new)
Complaints can also be lodged against decisions made by the online platform to not remove, not disable, not suspend and not terminate access to accounts.
2021/09/10
Committee: ECON
Amendment 422 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, or in appropriate circumstances terminate, the provision of their services to recipients of the service that frequently provide manifestly illegal content.
2021/06/24
Committee: ITRE
Amendment 426 #
Proposal for a regulation
Article 17 – paragraph 3 – point a (new)
(a) Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is indeed illegal and is incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does warrant the suspension or termination of the service or the account, it shall also reverse its decision referred to in paragraph 1 without undue delay.
2021/09/10
Committee: ECON
Amendment 434 #
Proposal for a regulation
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension, and the circumstances in which they will terminate their services.
2021/06/24
Committee: ITRE
Amendment 437 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
2. Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it has its main establishment or has its legal representative and also transmit this information to Europol for appropriate follow-up.
2021/06/24
Committee: ITRE
Amendment 438 #
Proposal for a regulation
Article 19 – paragraph 2 – introductory part
2. The status of trusted flaggers under this Regulation shall be awarded, upon application by any entities, by the Commission or by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated to meet all of the following conditions:
2021/09/10
Committee: ECON
Amendment 439 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform, except in the case of businesses with a vested interest in flagging counterfeit products of their brand, thus ensuring that the online consumer experience is safer and more reliable;
2021/09/10
Committee: ECON
Amendment 442 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with professional traders on the platform, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained from the trader the following information:
2021/06/24
Committee: ITRE
Amendment 443 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators and the Commission shall communicate to each other and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of trusted flagger in accordance with paragraph 2.
2021/09/10
Committee: ECON
Amendment 445 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the authority that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
2021/09/10
Committee: ECON
Amendment 447 #
Proposal for a regulation
Article 19 – paragraph 6
6. The authority that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the authority shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
2021/09/10
Committee: ECON
Amendment 448 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
(c) the bank account details of the trader, where the trader is a natural person;deleted
2021/06/24
Committee: ITRE
Amendment 459 #
Proposal for a regulation
Article 21 – paragraph 2 – introductory part
2. Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it has its main establishment or its legal representative and also transmit the information to Europol for appropriate follow-up.
2021/09/10
Committee: ECON
Amendment 462 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders on the platform, it shall ensure that traders can only use its services to promote messages on or to offer products, services or content to consumers located in the Union if, prior to the use of its services, the trader has provided the following information to the online platform:
2021/09/10
Committee: ECON
Amendment 464 #
Proposal for a regulation
Article 22 – paragraph 1 – point b
(b) a passport or a copy of the identification document of the trader or any other electronic identification as defined by Article 3 of Regulation (EU) No 910/2014 of the European Parliament and of the Council50 ; _________________ 50 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC
2021/09/10
Committee: ECON
Amendment 466 #
Proposal for a regulation
Article 22 – paragraph 1 – point d
(d) to the extent the contract relates to products that are subject to the Union Regulations listed in Article 4(5) of Regulation (EU) 2019/1020 of the European Parliament and the Council, the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and established in the Union, referred to in Article 4(1) of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or any relevant act of Union law; _________________ 51Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
2021/09/10
Committee: ECON
Amendment 468 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 1 (new)
Online platforms that facilitate the sale of harmonised consumer goods between a seller in a third country and a consumer in the EU and where there is no other manufacturer or importer in the EU, should verify that the product bears the required conformity mark (CE mark) and that it has other relevant documents (e.g. EU declaration of conformity). Traders from within the Union and from third countries should also have the option to voluntarily upload the relevant documents certifying that their goods meet the consumer protection standards of the EU. If the traders choose to do so, online platforms may then show proof of these documents to users as part of the user interface to instil more consumer confidence in the distance contracts conducted on their platforms.
2021/09/10
Committee: ECON
Amendment 470 #
Proposal for a regulation
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources. Provided that the online platform has made reasonable efforts to assess the information in points (a), (d) and (e), the online platform shall not be held liable for information provided by the trader that proves to be inaccurate.
2021/09/10
Committee: ECON
Amendment 471 #
Proposal for a regulation
Article 22 – paragraph 3 – introductory part
3. Where the online platform obtains indications, through its reasonable efforts under paragraph 2 or through Member States' consumer authorities, that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
2021/09/10
Committee: ECON
Amendment 483 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) where relevant, meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed for all advertisements related to public health, public security, civil discourse, political participation and equality.
2021/09/10
Committee: ECON
Amendment 489 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
2021/09/10
Committee: ECON
Amendment 490 #
Proposal for a regulation
Article 25 – paragraph 2
2. The Commission should be able to update this Regulation through legislative acts in accordance with Article 294 TFEU. Such revisions may be necessary to adjust the number of average monthly recipients of the service in the Union referred to in paragraph 1, where the Union’s population increases or decreases by at least 5 % in relation to its population in 2020 or, after adjustment by means of a legislative act, in relation to its population in the year in which the latest legislative act was adopted. In that case, it shall adjust the number so that it corresponds to 10 % of the Union’s population in the year in which it adopts the legislative act, rounded up or down to allow the number to be expressed in millions.
2021/09/10
Committee: ECON
Amendment 491 #
Proposal for a regulation
Article 25 – paragraph 2 – subparagraph 1 (new)
Member States may request the Commission to assess whether an online platform that does not meet the threshold of 45 million active monthly users set out in paragraph 1 may still cause significant and systemic societal risks. While an online platform may not meet the quantitative criteria to be categorised as a Very Large Online Platform, it may meet at least two of the following qualitative criteria: (a) it has a significant impact on the internal market; (b) it operates a core platform service which serves as an important gateway for business users to reach end users; (c) it enjoys an entrenched and durable position in its operations or it is foreseeable that it will enjoy such a position in the near future; (d) it repeatedly and systematically fails to take down illegal content, as evidenced in its transparency reporting as per Articles 13 and 24. If the Commission finds that the online platform does pose significant and systemic societal risks based on the above criteria, the Digital Services Coordinator of establishment may require the online platform to fulfil part of the obligations set out in Section 4 for a limited period of time until the risk has abated.
2021/09/10
Committee: ECON
Amendment 492 #
Proposal for a regulation
Article 25 – paragraph 3
3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active recipients of the service in the Union, for the purposes of paragraph 1. The methodology shall specify, in particular, how to determine the Union’s population and criteria to determine the average monthly active recipients of the service in the Union, taking into account different accessibility features.
2021/09/10
Committee: ECON
Amendment 512 #
Proposal for a regulation
Article 25 – paragraph 4 – subparagraph 1
4. The Digital Services Coordinator of establishment shall verify, at least every six months, whether the number of average monthly active recipients of the service in the Union of online platforms under their jurisdiction is equal to or higher than the number referred to in paragraph 1, or whether the operating model and nature of platform constitutes a systemic risk. On the basis of that verification, it shall adopt a decision designating the online platform as a very large online platform for the purposes of this Regulation, or terminating that designation, and communicate that decision, without undue delay, to the online platform concerned and to the Commission.
2021/06/24
Committee: ITRE
Amendment 515 #
Proposal for a regulation
Article 27 – paragraph 2 – subparagraph 1 (new)
(c) measures taken by the Digital Service Coordinators, the Board and the Commission to ensure that highly sensitive information and business secrets are kept confidential.
2021/09/10
Committee: ECON
Amendment 520 #
Proposal for a regulation
Article 28 – paragraph 1 – subparagraph 1 (new)
Digital Services Coordinators shall provide very large online platforms under their jurisdiction with an annual audit plan outlining the key areas of focus for the upcoming audit cycle.
2021/09/10
Committee: ECON
Amendment 521 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a) are independent from the very large online platform concerned and have not provided any other service to the platform in the previous 12 months;
2021/09/10
Committee: ECON
Amendment 523 #
Proposal for a regulation
Article 28 – paragraph 2 – point c
(c) have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.;
2021/09/10
Committee: ECON
Amendment 524 #
Proposal for a regulation
Article 28 – paragraph 2 – subparagraph 1 (new)
(d) have not provided an audit to the same very large online platform for more than three consecutive years.
2021/09/10
Committee: ECON
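Taken together, the amended Article 28(2) conditions form a cumulative eligibility test for audit organisations. A sketch, assuming the rotation rule in point (d) means an auditor who has already completed three consecutive yearly audits of the same platform may not take a fourth consecutive engagement:

```python
def auditor_eligible(independent: bool,
                     provided_other_services_last_12_months: bool,
                     consecutive_years_audited: int) -> bool:
    """Cumulative check over the amended Article 28(2) conditions (sketch):
    point (a) independence and no other service to the platform in the
    previous 12 months, point (d) no more than three consecutive years
    auditing the same very large online platform."""
    return (independent
            and not provided_other_services_last_12_months
            and consecutive_years_audited < 3)
```

Points (b) and (c), proven expertise and professional ethics, are qualitative and are deliberately left out of this boolean sketch.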
Amendment 525 #
Proposal for a regulation
Article 28 – paragraph 3 – point f
(f) where the audit opinion is not posiegative, operational recommendations on specific measures to achieve compliance. and risk- based remediation timelines with a focus on rectifying issues that have the potential to cause most harm to users of the service as a priority;
2021/09/10
Committee: ECON
Amendment 526 #
Proposal for a regulation
Article 28 – paragraph 3 – subparagraph 1 (new)
(g) where the organisations that perform the audits do not have enough information to conclude an opinion due to the novelty of the issues audited, a disclaimer shall be given.
2021/09/10
Committee: ECON
Amendment 534 #
Proposal for a regulation
Article 30 – title
Additional online advertising transparency for advertisements related to public welfare
2021/09/10
Committee: ECON
Amendment 536 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2 for all advertisements related to public health, public security, civil discourse, political participation and equality, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
2021/09/10
Committee: ECON
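The amended Article 30(1) implies two mechanical rules for the repository exposed through the APIs: entries stay visible until one year after the last display, and recipient personal data must never appear. A sketch under those assumptions; the field names (`last_displayed`, `recipient_personal_data`) are invented for illustration:

```python
from datetime import date, timedelta

# "until one year after the advertisement was displayed for the last time"
RETENTION = timedelta(days=365)

def repository_view(ads: list[dict], today: date) -> list[dict]:
    """Return the ads that must still be exposed through the repository API,
    with the (hypothetical) personal-data field stripped, per the requirement
    that the repository contain no personal data of recipients."""
    visible = []
    for ad in ads:
        if today - ad["last_displayed"] <= RETENTION:
            visible.append({k: v for k, v in ad.items()
                            if k != "recipient_personal_data"})
    return visible

# Illustrative entries for the categories the amendment singles out
ads = [
    {"id": "ad-1", "category": "public health",
     "last_displayed": date(2021, 6, 1), "recipient_personal_data": "..."},
    {"id": "ad-2", "category": "political participation",
     "last_displayed": date(2019, 1, 1), "recipient_personal_data": "..."},
]
view = repository_view(ads, today=date(2021, 9, 10))
```

Whether expired entries are deleted or merely hidden from the API is not settled by the amendment text; the sketch only filters the public view.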
Amendment 543 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. The delegated acts should also lay out the technical conditions needed to ensure confidentiality and security of information by the vetted researchers once they acquire access to the data, including guidelines for academics who wish to publish findings based on the confidential data acquired. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
2021/09/10
Committee: ECON
Amendment 560 #
Proposal for a regulation
Article 39 – paragraph 1 – subparagraph 1 (new)
Member States shall designate the status of Digital Services Coordinator based on the following criteria:
(a) the authority has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content;
(b) it represents collective interests and is independent from any online platform;
(c) it has the capacity to carry out its activities in a timely, diligent and objective manner.
2021/09/10
Committee: ECON
Amendment 574 #
Proposal for a regulation
Article 44 – paragraph 1
1. Digital Services Coordinators shall draw up an annual reports on their activities under this Regulation. They shall make the annual reports available to the public, and shall communicate them to the Commission and to the Board.
2021/09/10
Committee: ECON
Amendment 575 #
Proposal for a regulation
Article 44 – paragraph 2 – point b a (new)
(ba) measures taken by the Digital Services Coordinators to ensure that highly sensitive information and business secrets are kept confidential;
2021/09/10
Committee: ECON
Amendment 576 #
Proposal for a regulation
Article 44 – paragraph 2 – point b b (new)
(bb) an assessment of the interpretation of the Country of Origin principle in the supervisory and enforcement activities of the Digital Services Coordinators, especially with regard to Article 45 of this Regulation.
2021/09/10
Committee: ECON
Amendment 582 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a) are independent from the very large online platform concerned; and have not provided any other service to the platform in the previous 12 months.
2021/06/24
Committee: ITRE
Amendment 584 #
Proposal for a regulation
Article 28 – paragraph 2 – point c a (new)
(c a) have not audited the same very large online platform for more than 3 consecutive years.
2021/06/24
Committee: ITRE
Amendment 587 #
Proposal for a regulation
Article 28 – paragraph 3 – point f
(f) where the audit opinion is not positive, operationalegative, recommendations on specific measures to achieve compliance and risk-based remediation timelines with a focus on rectifying issues that have the potential to cause most harm to users of the service as a priority.
2021/06/24
Committee: ITRE
Amendment 588 #
Proposal for a regulation
Article 28 – paragraph 3 – point f a (new)
(f a) where the organisations that perform the audits do not have enough information to conclude an opinion due to the novelty of the issues audited, a relevant disclaimer.
2021/06/24
Committee: ITRE
Amendment 591 #
Proposal for a regulation
Article 28 – paragraph 4 a (new)
4 a. Digital Services Coordinators shall provide very large online platforms under their jurisdiction with an annual audit plan outlining the key areas of focus for the upcoming audit cycle.
2021/06/24
Committee: ITRE
Amendment 607 #
Proposal for a regulation
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms, without prejudice to Directive (EU) 2016/943 on trade secrets.
2021/09/10
Committee: ECON
Amendment 648 #
Proposal for a regulation
Article 69 – paragraph 2
2. The delegation of power referred to in Articles 23, 25, and 31 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation].
2021/09/10
Committee: ECON
Amendment 649 #
Proposal for a regulation
Article 69 – paragraph 3
3. The delegation of power referred to in Articles 23, 25 and 31 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.
2021/09/10
Committee: ECON
Amendment 650 #
Proposal for a regulation
Article 69 – paragraph 5
5. A delegated act adopted pursuant to Articles 23, 25 and 31 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of three months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.
2021/09/10
Committee: ECON
Amendment 655 #
Proposal for a regulation
Article 74 – paragraph 2 – introductory part
2. It shall apply from [date - threnine months after its entry into force].
2021/09/10
Committee: ECON