98 Amendments of Frances FITZGERALD related to 2020/0361(COD)
Amendment 28 #
Proposal for a regulation
Recital 2
(2) Up till now, politics has relied on voluntary cooperation with a view to addressing these risks and challenges. Since this has proved insufficient and there has been a lack of harmonised rules at Union level, Member States have been increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice.
Amendment 30 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, the gender equality principle and non-discrimination. In order to exercise these rights, the online world needs to be a safe space, especially for women and girls, where everybody can move freely. Therefore, measures to protect from, and prevent, phenomena such as online violence, cyberstalking, harassment, hate speech and exploitation of women and girls are essential.
Amendment 34 #
Proposal for a regulation
Recital 5
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26, that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities. Given that online platforms are part of our everyday life and have become indispensable, even more so since the pandemic, the spread of illegal and harmful content, such as child sexual abuse material, online sexual harassment, unlawful non-consensual sharing of private images and videos and cyber violence, has risen dramatically as well. Ensuring a safe space online implies targeted actions against all phenomena harmfully affecting our social life, including through an awaited proposal on how to deal with harmful but not illegal content online. _________________ 26 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1).
Amendment 35 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 Regulation (EU) 2021/784 of the European Parliament and of the Council29 and the recently adopted Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. _________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (OJ L 172, 17.5.2021, p. 79).
Amendment 37 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly in order to underpin the general idea that what is illegal offline should also be illegal online. The concept should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech, child sexual abuse material or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as trafficking in human beings, sexual exploitation of women and girls, the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images and videos, online stalking, grooming of adolescents, online sexual harassment and other forms of gender-based violence, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 47 #
Proposal for a regulation
Recital 26 a (new)
(26 a) While intermediary services have already applied risk assessments, there is still potential to improve the security and safety of all users, especially children, women and other vulnerable groups. Therefore providers of intermediary services, more precisely online platforms and very large online platforms, shall regularly evaluate their risk assessment and, if found necessary, improve it. Given the importance of providers of intermediary services and their potential to impact social life, common rules determining how users shall behave online should be applied. The implementation of a code of conduct should be obligatory for every provider of intermediary services covered by this Regulation.
Amendment 48 #
Proposal for a regulation
Recital 30
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679, the recently adopted Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. Member States should ensure that the competent authorities fulfil their tasks in an objective, independent and non-discriminatory manner. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) 2021/784 addressing the dissemination of terrorist content online, the recently adopted Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online, or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information.
Amendment 52 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors, women and girls, as well as vulnerable users, to protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 53 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. Providers offering their services in more than one Member State should provide a breakdown of the information by Member State. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 Aligned with the annual reports broken down by content moderation action and Member State, instances of all forms of violence against women and girls online, of hate speech and of other illegal content should be reflected in crime statistics. All forms of violence against women and girls shall be reported as a category of their own in those crime statistics and law enforcement entities shall list them separately. _________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 57 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the gender equality principle, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
Amendment 63 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising reproducing stereotypical content with an impact on the equal treatment and opportunities of citizens, contrary to the gender equality principle. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 66 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material, unlawful non-consensual sharing of private images and videos, online stalking or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the gender equality principle together with the right to non-discrimination and the rights of the child. The social dimension, as online platforms play a major role in our everyday life, is also affected by phenomena such as online harassment and cyber violence. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform, including when algorithms are misinformed, causing a widening of gender gaps, or the misuse of their service through the submission of abusive notices or other methods for silencing speech, causing harm, such as long-term mental health damage, psychological damage and societal damage, or hampering competition.
A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 77 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should be obliged to regularly review their algorithms to minimise such negative consequences and should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients have a visible, user-friendly and readily available option to turn off algorithmic selection with the recommender system entirely and options that are not based on profiling of the recipient. The gender-based algorithm bias must be prevented to avoid discriminatory impact on women and girls.
Amendment 84 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as gender equality and non-discrimination, eradicating all forms of violence against women and girls, including online violence, harassment and sexual exploitation, online stalking, child abuse, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.
Amendment 89 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter, including equality, are effectively protected.
Amendment 90 #
Proposal for a regulation
Article 1 – paragraph 5 – point d
(d) Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online;
Amendment 91 #
Proposal for a regulation
Article 1 – paragraph 5 – point d a (new)
(d a) Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online;
Amendment 93 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or the provision of services, is manifestly not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law; reporting or warning of an illegal act shall not be deemed illegal content;
Amendment 98 #
Proposal for a regulation
Article 12 – title
Terms and conditions and code of conduct
Amendment 101 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, non-discriminatory, transparent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 103 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. Providers of intermediary services shall be obliged to include on their platforms a code of conduct, setting out behavioural rules for their users. These rules shall be publicly accessible in an easily readable format and shall be set out in clear and unambiguous language.
Amendment 107 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include breakdowns at Member State level and, in particular, information on the following, as applicable:
Amendment 112 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1 a. Protection of the identity of the victims concerned shall be ensured, in line with the standards of Regulation (EU) 2016/679 (GDPR);
Amendment 135 #
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, gender equality, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
Amendment 138 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content, of content that risks an increase in online violence, and of information that is incompatible with their terms and conditions.
Amendment 140 #
Proposal for a regulation
Recital 4
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation and the competitiveness of European companies should not be hampered but instead be stimulated.
Amendment 148 #
(b) targeted measures aimed at limiting the display of advertisements or harmful content in association with the service they provide;
Amendment 161 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly in connection with information relating to illegal content, products, services and activities. The illegal nature of such content, products or services is defined by relevant Union law or national law in accordance with Union law. The concept should be understood, for example, to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 174 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation, without prejudice to Article 6, in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 182 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content, or against content that violates the community rules and guidelines of the intermediary services, that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 186 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services, cloud services or search engines. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, where they qualify as ‘mere conduit’, ‘caching’ or hosting service.
Amendment 193 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity.
Amendment 216 #
Proposal for a regulation
Recital 43
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact are such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. In this regard, the Commission and Digital Services Coordinators may work together on information and guidelines for the voluntary implementation of the provisions in this Regulation for micro or small enterprises. Furthermore, the Commission and Digital Services Coordinators are also encouraged to do so for medium enterprises, which, while not benefitting from the liability exemptions in Section 3, may sometimes lack the legal resources necessary to ensure proper understanding of, and compliance with, all provisions. _________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 220 #
Proposal for a regulation
Recital 46
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. Such entities can also include businesses who have a vested interest in flagging counterfeit products of their brand, thus ensuring the online consumer experience is safer and more reliable. Similarly, for intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
Amendment 225 #
Proposal for a regulation
Recital 48
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of content manifestly related to a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44 . In such instances, the online platform should inform without delay the relevant competent authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
Amendment 230 #
Proposal for a regulation
Recital 49
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders on the platforms should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
Amendment 232 #
Proposal for a regulation
Recital 50
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of some of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. The online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties or be liable for this information in case it proves to be inaccurate. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48.
_________________ 45 https://ec.europa.eu/taxation_customs/vies/ vieshome.do?selectedLanguage=en 46Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to- consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
Amendment 240 #
Proposal for a regulation
Recital 54
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by legislative acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. Provisions should also exist for Member States to request that the Commission assess whether an online platform that does not meet the threshold of 45 million active monthly users may still cause significant and systemic societal risks. While an online platform may not meet the quantitative criteria to be designated as a very large online platform, it may meet qualitative criteria. In such cases, the Digital Services Coordinator of establishment may require the online platform to fulfil part of the obligations set out in Section 4 for a limited period of time until the risk has abated.
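The threshold arithmetic in this recital, 45 million recipients as roughly 10% of the Union population, with recalculation once the population shifts by at least 5%, can be sketched as follows. The baseline population figure and the function names are illustrative assumptions, not terms of the Regulation:

```python
# Illustrative sketch of the Recital 54 threshold arithmetic.
# The 2020 baseline figure (~447 million Union residents) is an assumption
# used only for this example; the Regulation itself fixes the 45 million number.

def recalculate_threshold(current_population: int) -> int:
    """Return 10% of the population, rounded to the nearest million."""
    return round(current_population * 0.10 / 1_000_000) * 1_000_000

def adjustment_needed(base_population: int, current_population: int) -> bool:
    """An adjustment is triggered when the population changes by at least 5%."""
    change = abs(current_population - base_population) / base_population
    return change >= 0.05

BASELINE_2020 = 447_000_000  # assumed baseline, illustrative only
print(recalculate_threshold(BASELINE_2020))           # 45000000
print(adjustment_needed(BASELINE_2020, 470_000_000))  # True (about a 5.1% increase)
```

Rounding to whole millions mirrors the recital's requirement that the adjusted number be "expressed in millions".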
Amendment 253 #
Proposal for a regulation
Recital 58
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. Such reinforcement could include the expansion of, and resource allocation to, content moderation in languages other than English. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 258 #
Proposal for a regulation
Recital 61
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to, the measures taken by the very large online platform to comply with their obligations under this Regulation, without prejudice to its freedom to conduct a business and, in particular, its ability to design and implement effective measures that are aligned with its specific business model. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform systematically does not comply with this Regulation or the commitments undertaken. A disclaimer of an opinion should be given where the auditor does not have enough information to conclude on an opinion due to the novelty of the issues audited.
Amendment 263 #
Proposal for a regulation
Recital 63
Recital 63
(63) Advertising systems used by very large online platforms pose particular risks depending on the category of the advertisement and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements related to public health, public security, civil discourse, political participation and equality. The repositories of the advertisements related to these categories should be displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements in these specific categories and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
Amendment 301 #
Proposal for a regulation
Recital 100
Recital 100
(100) Compliance with the relevant obligations imposed under this Regulation should be enforceable by means of fines and periodic penalty payments. To that end, appropriate levels of fines and periodic penalty payments should also be laid down for systemic non-compliance with the relevant obligations and breach of the procedural rules, subject to appropriate limitation periods. A systematic infringement is a pattern of online harm that, when the individual harms are added up, constitutes an aggregation of systemic harm to active recipients of the service across three or more EU Member States.
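The aggregation described in this recital, where individual harms are "added up" and a pattern spanning three or more Member States is treated as systematic, can be pictured with the following sketch. The function, the input format (one Member State code per confirmed individual harm) and the per-state handling are hypothetical illustrations, not terms of the Regulation:

```python
# Hypothetical sketch of the Recital 100 aggregation logic: count the
# distinct Member States in which individual harms to active recipients
# were confirmed, and flag a systematic infringement at three or more.
from collections import Counter

def is_systematic(harm_reports: list[str], min_states: int = 3) -> bool:
    """harm_reports holds one Member State code per confirmed individual harm."""
    harms_per_state = Counter(harm_reports)  # e.g. {"IE": 2, "FR": 1, "DE": 1}
    return len(harms_per_state) >= min_states

print(is_systematic(["IE", "FR", "DE", "IE"]))  # True: three distinct states
print(is_systematic(["IE", "IE", "FR"]))        # False: only two states
```

The `Counter` keeps the per-state totals available should a regulator also want to weigh the volume of harm in each Member State, not merely its presence.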
Amendment 327 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, through the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
Amendment 328 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature or functionality of another service or the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation.
Amendment 346 #
Proposal for a regulation
Article 5 – paragraph 3
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders. It is important that hosting services adopt the highest standards of transparency to highlight, in a way that would lead an average and reasonably well-informed consumer to understand, that the information, or the product or service that is the object of the transaction, comes from a third party which is not offered by the hosting service.
Amendment 347 #
Proposal for a regulation
Article 6 – paragraph 1
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures for the implementation of community rules and guidelines of their services, or to comply with the requirements of Union law, including those set out in this Regulation, or national law in accordance with Union law.
Amendment 354 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 a (new)
Article 8 – paragraph 2 – point a – indent 3 a (new)
- the order is transmitted via secure channels established between the relevant national judicial or administrative authorities and the providers of intermediary services;
Amendment 356 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 1 (new)
Article 8 – paragraph 2 – subparagraph 1 (new)
In extraordinary cases, where the intermediary service has reasonable doubt that the removal order is not legally sound, the intermediary service should have access to a mechanism to challenge the decision. This mechanism shall be established by the Digital Services Coordinators in coordination with the Board and the Commission.
Amendment 358 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
Article 9 – paragraph 2 – point a – indent 1 a (new)
- precise identification elements of the recipients of the service concerned;
Amendment 359 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2 a (new)
Article 9 – paragraph 2 – point a – indent 2 a (new)
- the order is transmitted via secure channels established between the relevant national judicial or administrative authorities and the providers of intermediary services.
Amendment 361 #
Proposal for a regulation
Article 10 – paragraph 2
Article 10 – paragraph 2
2. Providers of intermediary services shall make public to trusted flaggers as well as users in all Member States the information necessary to easily identify and communicate with their intermediary services' single points of contact.
Amendment 378 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
Article 13 – paragraph 1 – point c
(c) the content moderation engaged in through the provider’s voluntary own-initiative investigations as per Article 6, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures;
Amendment 394 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
Article 14 – paragraph 2 – point b
(b) where possible, a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
Amendment 395 #
Proposal for a regulation
Article 14 – paragraph 2 – point c
Article 14 – paragraph 2 – point c
(c) where possible, the name and an electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;
Amendment 399 #
Proposal for a regulation
Article 14 – paragraph 3
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 solely in respect of the specific item of information concerned, when the provider of hosting services can unequivocally identify the illegal nature of the content.
Amendment 409 #
Proposal for a regulation
Article 15 – paragraph 2 – subparagraph 1 (new)
Article 15 – paragraph 2 – subparagraph 1 (new)
Where a provider of hosting services decides not to remove or disable access to specific items of information provided by the recipients of the service, detected through the mechanisms established in Article 14, it shall inform the user who notified the online platform of the content and, where needed, the recipient of the decision without undue delay. The notification of such a decision can be done through automated means.
Amendment 411 #
Proposal for a regulation
Article 15 – paragraph 4
Article 15 – paragraph 4
Amendment 415 #
Proposal for a regulation
Article 16 – paragraph 1 – subparagraph 1 (new)
Article 16 – paragraph 1 – subparagraph 1 (new)
The Commission and Digital Service Coordinators may work together on information and guidelines for the voluntary implementation of the provisions in this Regulation for micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 416 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide to all recipients of the service, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge. Complaints can be filed against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 420 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable or restrict access to the information;
Amendment 421 #
Proposal for a regulation
Article 17 – paragraph 1 – subparagraph 1 (new)
Article 17 – paragraph 1 – subparagraph 1 (new)
Complaints can also be lodged against decisions made by the online platform to not remove, not disable, not suspend and not terminate access to accounts.
Amendment 426 #
Proposal for a regulation
Article 17 – paragraph 3 – point a (new)
Article 17 – paragraph 3 – point a (new)
(a) Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is indeed illegal and is incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does warrant the suspension or termination of the service or the account, it shall also reverse its decision referred to in paragraph 1 without undue delay.
Amendment 438 #
Proposal for a regulation
Article 19 – paragraph 2 – introductory part
Article 19 – paragraph 2 – introductory part
2. The status of trusted flaggers under this Regulation shall be awarded, upon application by any entities, by the Commission or by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated to meet all of the following conditions:
Amendment 439 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform, except in the case of businesses with a vested interest in flagging counterfeit products of their brand, thus ensuring the online consumer experience is safer and more reliable;
Amendment 443 #
Proposal for a regulation
Article 19 – paragraph 3
Article 19 – paragraph 3
3. Digital Services Coordinators and the Commission shall communicate to each other and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of trusted flagger in accordance with paragraph 2.
Amendment 445 #
Proposal for a regulation
Article 19 – paragraph 5
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the authority that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
Amendment 447 #
Proposal for a regulation
Article 19 – paragraph 6
Article 19 – paragraph 6
6. The authority that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the authority shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
Amendment 459 #
Proposal for a regulation
Article 21 – paragraph 2 – introductory part
Article 21 – paragraph 2 – introductory part
2. Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it has its main establishment or its legal representative and also transmit the information to Europol for appropriate follow-up.
Amendment 462 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders on the platform, it shall ensure that traders can only use its services to promote messages on or to offer products, services or content to consumers located in the Union if, prior to the use of its services, the trader has provided the following information to the online platform:
Amendment 464 #
Proposal for a regulation
Article 22 – paragraph 1 – point b
Article 22 – paragraph 1 – point b
(b) a passport or a copy of the identification document of the trader or any other electronic identification as defined by Article 3 of Regulation (EU) No 910/2014 of the European Parliament and of the Council50 ; _________________ 50 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC
Amendment 466 #
Proposal for a regulation
Article 22 – paragraph 1 – point d
Article 22 – paragraph 1 – point d
(d) to the extent the contract relates to products that are subject to the Union Regulations listed in Article 4(5) of Regulation (EU) 2019/1020 of the European Parliament and the Council, the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and established in the Union, referred to in Article 4(1) of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or any relevant act of Union law; _________________ 51Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
Amendment 468 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 1 (new)
Article 22 – paragraph 1 – subparagraph 1 (new)
Online platforms that facilitate the sale of harmonised consumer goods between a seller in a third country and a consumer in the EU and where there is no other manufacturer or importer in the EU, should verify that the product bears the required conformity mark (CE mark) and that it has other relevant documents (e.g. EU declaration of conformity). Traders from within the Union and from third countries should also have the option to voluntarily upload the relevant documents certifying that their goods meet the consumer protection standards of the EU. If the traders choose to do so, online platforms may then show proof of these documents to users as part of the user interface to instil more consumer confidence in the distance contracts conducted on their platforms.
Amendment 470 #
Proposal for a regulation
Article 22 – paragraph 2
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union, or through requests to the trader to provide supporting documents from reliable sources. Provided that the online platform has made such reasonable efforts, it shall not be held liable for information provided by the trader that proves to be inaccurate.
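One way to picture the 'reasonable efforts' assessment in this paragraph is a completeness check over the trader-supplied items, with an optional cross-check against an official register. The field names and the lookup callable below are illustrative assumptions, not defined by the Regulation:

```python
# Hedged sketch of an Article 22(2)-style check: confirm the required
# trader items are present, then optionally cross-check the registration
# entry against a free official database (e.g. a national trade register
# or the VAT Information Exchange System). Field names are hypothetical.
from typing import Callable, Optional

REQUIRED_FIELDS = ("name", "address", "economic_operator", "registration")

def reasonable_efforts_check(
    trader_info: dict[str, str],
    registry_lookup: Optional[Callable[[str], bool]] = None,
) -> bool:
    """Return True when all required items are present and, where a registry
    lookup is available, the registration entry can be found there."""
    if any(not trader_info.get(field) for field in REQUIRED_FIELDS):
        return False
    if registry_lookup is not None:
        return registry_lookup(trader_info["registration"])
    return True
```

Consistent with the paragraph's liability carve-out, a passing check would not make the platform a guarantor of the data: the sketch only records that reasonable efforts were made, not that the information is true.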
Amendment 471 #
Proposal for a regulation
Article 22 – paragraph 3 – introductory part
Article 22 – paragraph 3 – introductory part
3. Where the online platform obtains indications, through its reasonable efforts under paragraph 2 or through Member States' consumer authorities, that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
Amendment 483 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
Article 24 – paragraph 1 – point c
(c) where relevant, meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed for all advertisements related to public health, public security, civil discourse, political participation and equality.
Amendment 489 #
Proposal for a regulation
Article 25 – paragraph 1
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
Amendment 490 #
Proposal for a regulation
Article 25 – paragraph 2
Article 25 – paragraph 2
2. The Commission shall adopt delegated acts in accordance with Article 69ould be able to update this Regulation through legislative acts in accordance with Article 294 of TFEU. Such revisions may be necessary to adjust the number of average monthly recipients of the service in the Union referred to in paragraph 1, where the Union’s population increases or decreases at least with 5 % in relation to its population in 2020 or, after adjustment by means of a delegatedislative act, of its population in the year in which the latest delegatedislative act was adopted. In that case, it shall adjust the number so that it corresponds to 10% of the Union’s population in the year in which it adopts the delegatedislative act, rounded up or down to allow the number to be expressed in millions.
Amendment 491 #
Proposal for a regulation
Article 25 – paragraph 2 – subparagraph 1 (new)
Article 25 – paragraph 2 – subparagraph 1 (new)
Member States may request the Commission to assess whether an online platform that does not meet the threshold of 45 million active monthly users set out in paragraph 1 may still cause significant and systemic societal risks. While an online platform may not meet the quantitative criteria to be categorised as a Very Large Online Platform, it may meet at least two of the following qualitative criteria: (a) it has a significant impact on the internal market; (b) it operates a core platform service which serves as an important gateway for business users to reach end users; (c) it enjoys an entrenched and durable position in its operations or it is foreseeable that it will enjoy such a position in the near future; (d) it repeatedly and systemically fails to take down illegal content, as evidenced in its transparency reporting as per Articles 13 and 24. If the Commission finds that the online platform does pose significant and systemic societal risks based on the above criteria, the Digital Services Coordinator of establishment may require the online platform to fulfil part of the obligations set out in Section 4 for a limited number of times until the risk has abated.
Amendment 492 #
Amendment 515 #
Proposal for a regulation
Article 27 – paragraph 2 – subparagraph 1 (new)
Article 27 – paragraph 2 – subparagraph 1 (new)
(c) measures taken by the Digital Service Coordinators, the Board and the Commission to ensure that highly sensitive information and business secrets are kept confidential.
Amendment 520 #
Proposal for a regulation
Article 28 – paragraph 1 – subparagraph 1 (new)
Article 28 – paragraph 1 – subparagraph 1 (new)
Digital Services Coordinators shall provide very large online platforms under their jurisdiction with an annual audit plan outlining the key areas of focus for the upcoming audit cycle.
Amendment 521 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
Article 28 – paragraph 2 – point a
(a) are independent from the very large online platform concerned and have not provided any other service to the platform in the previous 12 months;
Amendment 523 #
Proposal for a regulation
Article 28 – paragraph 2 – point c
Article 28 – paragraph 2 – point c
(c) have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.;
Amendment 524 #
Proposal for a regulation
Article 28 – paragraph 2 – subparagraph 1 (new)
Article 28 – paragraph 2 – subparagraph 1 (new)
(d) have not provided an audit to the same very large online platform for more than three consecutive years.
Amendment 525 #
Proposal for a regulation
Article 28 – paragraph 3 – point f
Article 28 – paragraph 3 – point f
(f) where the audit opinion is not posiegative, operational recommendations on specific measures to achieve compliance. and risk-based remediation timelines with a focus on rectifying issues that have the potential to cause most harm to users of the service as a priority;
Amendment 526 #
Proposal for a regulation
Article 28 – paragraph 3 – subparagraph 1 (new)
Article 28 – paragraph 3 – subparagraph 1 (new)
(g) where the organisations that perform the audits do not have sufficient information to reach an opinion due to the novelty of the issues audited, a disclaimer shall be issued.
Amendment 534 #
Proposal for a regulation
Article 30 – title
Article 30 – title
Additional online advertising transparency for advertisements related to public welfare
Amendment 536 #
Proposal for a regulation
Article 30 – paragraph 1
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2 for all advertisements related to public health, public security, civil discourse, political participation and equality, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
Amendment 543 #
Proposal for a regulation
Article 31 – paragraph 5
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. The delegated acts should also lay out the technical conditions needed to ensure confidentiality and security of information by the vetted researchers once they acquire access to the data, including guidelines for academics who wish to publish findings based on the confidential data acquired. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 560 #
Proposal for a regulation
Article 39 – paragraph 1 – subparagraph 1 (new)
Article 39 – paragraph 1 – subparagraph 1 (new)
Member States shall designate the status of Digital Services Coordinator based on the following criteria: (a) the authority has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content; (b) it represents collective interests and is independent from any online platform; (c) it has the capacity to carry out its activities in a timely, diligent and objective manner.
Amendment 574 #
Proposal for a regulation
Article 44 – paragraph 1
Article 44 – paragraph 1
1. Digital Services Coordinators shall draw up an annual reports on their activities under this Regulation. They shall make the annual reports available to the public, and shall communicate them to the Commission and to the Board.
Amendment 575 #
Proposal for a regulation
Article 44 – paragraph 2 – point b a (new)
Article 44 – paragraph 2 – point b a (new)
(ba) measures taken by the Digital Service Coordinators to ensure that highly sensitive information and business secrets are kept confidential;
Amendment 576 #
Proposal for a regulation
Article 44 – paragraph 2 – point b b (new)
Article 44 – paragraph 2 – point b b (new)
(bb) an assessment of the interpretation of the Country of Origin principle in the supervisory and enforcement activities of the Digital Services Coordinators, especially as regards Article 45 of this Regulation.
Amendment 607 #
Proposal for a regulation
Article 57 – paragraph 1
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms, without prejudice to Directive (EU) 2016/943 on trade secrets.
Amendment 648 #
Proposal for a regulation
Article 69 – paragraph 2
Article 69 – paragraph 2
2. The delegation of power referred to in Articles 23, 25, and 31 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation].
Amendment 649 #
Proposal for a regulation
Article 69 – paragraph 3
Article 69 – paragraph 3
3. The delegation of power referred to in Articles 23, 25 and 31 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.
Amendment 650 #
Proposal for a regulation
Article 69 – paragraph 5
Article 69 – paragraph 5
5. A delegated act adopted pursuant to Articles 23, 25 and 31 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of three months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.
Amendment 655 #
Proposal for a regulation
Article 74 – paragraph 2 – introductory part
Article 74 – paragraph 2 – introductory part
2. It shall apply from [date - threnine months after its entry into force].