
107 Amendments of Hilde VAUTMANS related to 2020/0361(COD)

Amendment 29 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to gender equality and non-discrimination. Children, especially girls, have specific rights enshrined in Article 24 of the Charter and in the United Nations Convention on the Rights of the Child. As such, the best interests of the child should be a primary consideration in all matters affecting them. The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.
2021/07/15
Committee: FEMM
Amendment 49 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as health, including mental health, and the safety and trust of the recipients of the service, including minors, women, LGBTIQ+ people and vulnerable users such as those with protected characteristics under Article 21 of the Charter, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. The World Health Organisation defines ‘health’ as a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity. This definition supports the fact that the development of new technologies might bring new health risks to users, in particular for children and women, such as psychological risks, developmental risks, mental health risks, depression, loss of sleep, or altered brain function.
2021/07/15
Committee: FEMM
Amendment 59 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination, the right to gender equality and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
2021/07/15
Committee: FEMM
Amendment 69 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/15
Committee: FEMM
Amendment 86 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including gender equality, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
2021/07/15
Committee: FEMM
Amendment 92 #
Proposal for a regulation
Article 2 – paragraph 1 – point d a (new)
(d a) ‘child’ means any natural person under the age of 18;
2021/07/15
Committee: FEMM
Amendment 100 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1 a. Providers of intermediary services shall ensure that their terms and conditions are age-appropriate, promote gender equality and the rights of LGBTIQ+ people, and meet the highest European or international standards, pursuant to Article 34.
2021/07/15
Committee: FEMM
Amendment 105 #
Proposal for a regulation
Article 12 a (new)
Article 12 a
Child impact assessment
1. All providers shall assess whether their services are accessed by, likely to be accessed by, or impact children, especially girls. Providers of services likely to impact children, especially girls, shall identify, analyse and assess, during the design and development of new services, on an ongoing basis and at least once a year, any systemic risks stemming from the functioning and use of their services in the Union for children, especially girls. These risk impact assessments shall be specific to their services, meet the highest European or international standards detailed in Article 34, and shall consider all known content, contact, conduct and contract risks, including commercial risks. Assessments shall also include the following systemic risks:
(a) the dissemination of illegal content or behaviour enabled by, manifested on or arising as a result of their services;
(b) any negative effects for the exercise of the rights of the child, as enshrined in Article 24 of the Charter and in the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment;
(c) any negative effects on the right to gender equality, as enshrined in Article 23 of the Charter, particularly the right to live free from violence as envisaged by the Council of Europe Convention on preventing and combating violence against women and girls (Istanbul Convention);
(d) any negative effects on the right to non-discrimination, as enshrined in Article 21 of the Charter;
(e) any intended or unintended consequences resulting from the operation or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on children's rights, especially of girls.
2. When conducting such impact assessments, providers of intermediary services likely to impact children, especially girls, shall take into account, in particular, how their terms and conditions, content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions or with the rights of the child, especially of girls.
2021/07/15
Committee: FEMM
Amendment 106 #
Proposal for a regulation
Article 12 b (new)
Article 12 b
Mitigation of risks to children, especially girls
Providers of intermediary services likely to impact children, especially girls, shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 12 a. Such measures shall include, where applicable:
(a) implementing mitigation measures identified in Article 27 with regard for children’s best interests;
(b) adapting or removing system design features that expose children to content, contact, conduct and contract risks, as identified in the process of conducting child impact assessments;
(c) implementing proportionate and privacy-preserving age assurance, meeting the standard outlined in Article 34;
(d) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure they prioritise the best interests of the child and gender equality;
(e) ensuring the highest levels of privacy, safety, and security by design and default for users under the age of 18;
(f) preventing the profiling of children, including for commercial purposes such as targeted advertising;
(g) ensuring published terms are age-appropriate and uphold children’s rights and gender equality;
(h) providing child-friendly and inclusive mechanisms for remedy and redress, including easy access to expert advice and support.
2021/07/15
Committee: FEMM
Amendment 110 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the gender of complainants and, in the case of children, their age, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
2021/07/15
Committee: FEMM
Amendment 111 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1 a. Providers of intermediary services that impact children, especially girls, shall publish, at least once a year:
(a) child impact assessments to identify known harms, unintended consequences and emerging risks; these impact assessments shall comply with the standards outlined in Article 34;
(b) clear, easily comprehensible and detailed reports outlining the gender equality and child risk mitigation measures undertaken, their efficacy and any outstanding actions required; these reports shall comply with the standards outlined in Article 34, including as regards age assurance and age verification, in line with a child-centred design that equally promotes gender equality.
2021/07/15
Committee: FEMM
Amendment 124 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling and redress systems are easy to access and user-friendly, including for children, especially girls, and that they enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.
2021/07/15
Committee: FEMM
Amendment 128 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/15
Committee: FEMM
Amendment 129 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. Children have specific rights enshrined in Article 24 of the Charter and in the United Nations Convention on the Rights of the Child. The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.
2021/06/10
Committee: LIBE
Amendment 130 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/15
Committee: FEMM
Amendment 144 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, content-sharing platforms, search engines, livestreaming platforms, messaging services or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
2021/06/10
Committee: LIBE
Amendment 144 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/15
Committee: FEMM
Amendment 146 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admission to a user group, such information should only be considered to be publicly available when users seeking to access such information are automatically registered or admitted without human intervention to decide or select the users to whom access is granted. The mere possibility to create groups of users of a given service, including a messaging service, should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. _________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
2021/06/10
Committee: LIBE
Amendment 149 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1 a. Where a very large online platform decides not to put in place any of the mitigation measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report referred to in Article 28(3).
2021/07/15
Committee: FEMM
Amendment 151 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
2021/07/15
Committee: FEMM
Amendment 158 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
2 a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child and the right to gender equality, observance of which, once adopted, will be mandatory, at least for the following:
(a) age assurance and age verification pursuant to Article 13;
(b) child impact assessments pursuant to Article 13;
(c) age-appropriate terms and conditions that equally promote gender equality pursuant to Article 12;
(d) child-centred design that equally promotes gender equality pursuant to Article 13.
2021/07/15
Committee: FEMM
Amendment 162 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken in order to detect, identify and act on illegal pieces of content on a voluntary basis, should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
2021/06/10
Committee: LIBE
Amendment 167 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Member States should however have the possibility to require from service providers, who host information provided by users of their service, to apply a diligent duty of care.
2021/06/10
Committee: LIBE
Amendment 167 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission, acting on its own initiative, or the Board, acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
2021/07/15
Committee: FEMM
Amendment 168 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/15
Committee: FEMM
Amendment 169 #
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
2. When the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/15
Committee: FEMM
Amendment 171 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements to ensure the effective processing of those orders.
2021/06/10
Committee: LIBE
Amendment 175 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union or national law, or international law and the interests of international comity.
2021/06/10
Committee: LIBE
Amendment 177 #
Proposal for a regulation
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information under either Union or national law, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.
2021/06/10
Committee: LIBE
Amendment 185 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors, women and vulnerable users, such as those with protected characteristics under Article 21 of the Charter, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/20
Committee: JURI
Amendment 187 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. Children have specific rights enshrined in Article 24 of the Charter and in the United Nations Convention on the Rights of the Child. As such, the best interests of the child should be a primary consideration in all matters affecting them. The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.
2021/07/08
Committee: IMCO
Amendment 188 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means that have been proven to be efficient, proportionate and accurate, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
2021/06/10
Committee: LIBE
Amendment 198 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content for instance, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/06/10
Committee: LIBE
Amendment 201 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and reliable safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal, respectively, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/06/10
Committee: LIBE
Amendment 203 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, notably when it concerns vulnerable users such as children, for example offences specified in Directive 2011/93/EU of the European Parliament and of the Council.44 In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/06/10
Committee: LIBE
Amendment 204 #
Proposal for a regulation
Recital 48 a (new)
(48 a) In order to prevent situations such as the one which led to the murder of Samuel Paty, where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall remove or disable the content and promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.1a _________________ 1a Samuel Paty, a French history teacher, was assassinated outside the Bois d'Aulne secondary school near Paris on October 16, 2020, following an online hate campaign after he had taught students about freedom of expression and blasphemy
2021/06/10
Committee: LIBE
Amendment 205 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination, the right to gender equality and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
2021/07/20
Committee: JURI
Amendment 209 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/06/10
Committee: LIBE
Amendment 210 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, specifically regarding disinformation, misinformation, hate speech or any other types of harmful content, there being no alternative and less restrictive measures that would effectively achieve the same result.
2021/06/10
Committee: LIBE
Amendment 213 #
Proposal for a regulation
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause real societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures.
2021/06/10
Committee: LIBE
Amendment 216 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform, the misuse of their service through the submission of abusive notices or other methods for silencing speech, hampering competition, or the way platforms’ terms and conditions, including content moderation policies, are enforced, including through automatic means. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, fundamental rights, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/06/10
Committee: LIBE
Amendment 223 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Under such mitigating measures, very large online platforms should consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions, as well as making content moderation policies, and the way they are enforced, fully transparent for users. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/06/10
Committee: LIBE
Amendment 229 #
Proposal for a regulation
Recital 60
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate, efficient and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.
2021/06/10
Committee: LIBE
Amendment 241 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms, such as the dissemination of illegal and harmful content, brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
2021/06/10
Committee: LIBE
Amendment 246 #
Proposal for a regulation
Recital 67
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation, and encourage online platforms to follow those codes. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
2021/06/10
Committee: LIBE
Amendment 249 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another aspect which needs to be considered is the possible negative impacts of systemic risks on society and democracy, such as disinformation, harmful content, in particular hate speech, or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
2021/06/10
Committee: LIBE
Amendment 251 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. This holds in particular for the latter, since the Commission has issued guidance for strengthening the Code of practice on disinformation, as announced in the European Democracy Action Plan, in May 2021.
2021/06/10
Committee: LIBE
Amendment 253 #
Proposal for a regulation
Recital 71
(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross- border response in the online environment in the public interest. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.
2021/06/10
Committee: LIBE
Amendment 319 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union or national law, including those set out in this Regulation.
2021/06/10
Committee: LIBE
Amendment 342 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety, health and trust of the recipients of the service, including minors, women and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to provide recourse to recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/08
Committee: IMCO
Amendment 345 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as health, including mental health, and the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/08
Committee: IMCO
Amendment 367 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for specific reasons such as ones related to the prevention, investigation, detection and prosecution of criminal offences;
2021/06/10
Committee: LIBE
Amendment 388 #
Proposal for a regulation
Article 9 – paragraph 4
4. The conditions and requirements laid down in this article shall be without prejudice to requirements falling under national criminal procedural law in conformity with Union law.
2021/06/10
Committee: LIBE
Amendment 402 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter, as well as national and Union law.
2021/06/10
Committee: LIBE
Amendment 412 #
Proposal for a regulation
Article 12 a (new)
Article 12 a
Child impact assessment
1. All providers must assess whether their services are accessed by, likely to be accessed by, or impact on children, defined as persons under the age of 18. Providers of services likely to be accessed by or impact on children shall identify, analyse and assess, during the design and development of new services and at least once a year thereafter, any systemic risks stemming from the functioning and use made of their services in the Union by children. These risk impact assessments shall be specific to their services, meet the highest European or international standards detailed in Article 34, and shall consider all known content, contact, conduct and contract risks, including commercial risks. Assessments should also include the following systemic risks:
a. the dissemination of illegal content or behaviour enabled by, manifested on or arising as a result of their services;
b. any negative effects for the exercise of the rights of the child, as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment;
c. any intended or unintended consequences resulting from the operation or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection or rights of children.
2. When conducting child impact assessments, providers of intermediary services likely to impact children shall take into account, in particular, how their terms and conditions, content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions or with the rights of the child.
2021/06/10
Committee: LIBE
Amendment 414 #
Proposal for a regulation
Article 12 b (new)
Article 12 b
Mitigation of risks to children
Providers of intermediary services likely to impact children shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 13 (12 a new). Such measures shall include, where applicable:
a. implementing mitigation measures identified in Article 27 with regard for children’s best interests;
b. adapting or removing system design features that expose children to content, contact, conduct and contract risks, as identified in the process of conducting child impact assessments;
c. implementing proportionate and privacy-preserving age assurance, meeting the standard outlined in Article 34;
d. adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure they prioritise the best interests of the child;
e. ensuring the highest levels of privacy, safety, and security by design and default for users under the age of 18;
f. preventing profiling, including for commercial purposes like targeted advertising;
g. ensuring published terms are age appropriate and uphold children’s rights;
h. providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support.
2021/06/10
Committee: LIBE
Amendment 422 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the age of complainants (if minors), the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
2021/06/10
Committee: LIBE
Amendment 427 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
2 a. Providers of intermediary services that impact on children shall publish, at least once a year:
a. child impact assessments to identify known harms, unintended consequences and emerging risks, pursuant to Article 13 (Art. 12 a new). The child impact assessments must comply with the standards outlined in Article 34;
b. clear, easily comprehensible and detailed reports outlining the child risk mitigation measures undertaken pursuant to Article 14, their efficacy and any outstanding actions required. These reports must comply with the standards outlined in Article 34, including as regards age assurance and age verification, in line with a child-centred design. The content of these reports must be verifiable by independent audit; data sets and source code must be made available at the request of the regulator.
2021/06/10
Committee: LIBE
Amendment 433 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means, for instance a clearly identifiable banner allowing the users of those hosting services to notify the providers quickly and easily of illegal content or manifestly illegal content when it is encountered, as well as putting at users’ disposal, including children, information regarding what is considered illegal content under Union and national law. They shall also, where applicable, inform users, including children, of the public tools available in the Member State in which the service is provided for signalling manifestly illegal content to the authorities.
2021/06/10
Committee: LIBE
Amendment 478 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service through the submission of abusive notices, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/08
Committee: IMCO
Amendment 489 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content under national or Union law, or incompatible with its terms and conditions:
2021/06/10
Committee: LIBE
Amendment 557 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content, unless that manifestly illegal content was due to wrongful notices and complaints as described in paragraph 2 of this Article.
2021/06/10
Committee: LIBE
Amendment 559 #
Proposal for a regulation
Article 12 a (new)
Article 12a
General Risk Assessment and Mitigation Measures
1. Providers of intermediary services shall identify, analyse and assess, at least once a year and at each significant revision of a service they provide thereafter, the potential misuse or other risks stemming from the functioning and use made of their services in the Union. Such a general risk assessment shall be specific to each of their services and shall include at least risks related to the dissemination of illegal content through their services and any content that might have a negative effect on potential recipients of the service, especially minors and gender equality.
2. Providers of intermediary services shall, wherever possible, attempt to put in place reasonable, proportionate and effective mitigation measures for the risks identified, in line with applicable law and their terms and conditions.
3. Where the identified risk relates to minor recipients of the service, regardless of whether the minor is acting in accordance with the terms and conditions, mitigation measures shall include, where needed and applicable:
(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure those prioritise the best interests of the minor;
(b) adapting or removing system design features that expose minors to, or promote, content, contact, conduct and contract risks that impair their physical, mental or moral development;
(c) ensuring the highest levels of privacy, safety, consumer protection and security by design and default for individual recipients of the service under the age of 18;
(d) if a service is targeted at minors, providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support.
4. Providers of intermediary services shall, upon request, explain to the competent Digital Services Coordinator how they undertook this risk assessment and what mitigation measures they undertook.
2021/07/19
Committee: JURI
Amendment 578 #
Proposal for a regulation
Article 21 – paragraph 1
1. Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall remove or disable the content and promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
2021/06/10
Committee: LIBE
Amendment 591 #
Proposal for a regulation
Article 23 – paragraph 1 – point b
(b) the number of suspensions imposed pursuant to Article 20, distinguishing clearly between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
2021/06/10
Committee: LIBE
Amendment 632 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how and whether their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
2021/06/10
Committee: LIBE
Amendment 640 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to address the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
2021/06/10
Committee: LIBE
Amendment 666 #
Proposal for a regulation
Article 2 – paragraph 1 – point d a (new)
(da) ‘child’ means any natural person under the age of 18;
2021/07/08
Committee: IMCO
Amendment 739 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1), which is in the public interest.
2021/06/10
Committee: LIBE
Amendment 772 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
1 a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child, observance of which, once adopted, will be mandatory, at least for the following:
a. age assurance and age verification pursuant to Articles 12 a (new) and 12 b (new) and 13;
b. child impact assessments pursuant to Articles 12 a (new) and 13;
c. age-appropriate terms and conditions pursuant to Article 12;
d. child-centred design pursuant to Articles 12 b (new) and 13.
2021/06/10
Committee: LIBE
Amendment 774 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content under Union and national law and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
2021/06/10
Committee: LIBE
Amendment 785 #
Proposal for a regulation
Article 36 – paragraph 3
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. The Commission shall supervise the monitoring of the application of those codes two years after the application of this Regulation.
2021/06/10
Committee: LIBE
Amendment 840 #
Proposal for a regulation
Article 24 – paragraph 1 e (new)
Online platforms shall not be allowed to resort to cross-device and cross-service combination of data processed inside or outside the platform.
2021/07/19
Committee: JURI
Amendment 856 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/19
Committee: JURI
Amendment 864 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/19
Committee: JURI
Amendment 882 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/19
Committee: JURI
Amendment 898 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report in Article 28(3).
2021/07/19
Committee: JURI
Amendment 908 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators and following public consultations, shall issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved.
2021/07/19
Committee: JURI
Amendment 919 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by organisations which have been selected by the Commission and:
2021/07/19
Committee: JURI
Amendment 937 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1a. Providers of intermediary services shall ensure their terms and conditions are age-appropriate and meet the highest European or International standards, pursuant to Article 34.
2021/07/08
Committee: IMCO
Amendment 945 #
Proposal for a regulation
Article 30 – title
Additional online advertising transparency and protection
2021/07/19
Committee: JURI
Amendment 966 #
Proposal for a regulation
Article 12 a (new)
Article 12a
General Risk Assessment and Mitigation Measures
1. Providers of intermediary services shall identify, analyse and assess, at least once and at each significant revision of a service thereafter, the potential misuse or other risks stemming from the functioning and use made of their services in the Union. Such a general risk assessment shall be specific to their services and shall include at least risks related to the dissemination of illegal content through their services and any content that might have a negative effect on potential recipients of the service, especially minors and gender equality.
2. Providers of intermediary services which identify potential risks shall, wherever possible, attempt to put in place reasonable, proportionate and effective mitigation measures in line with their terms and conditions.
3. Where the identified risk relates to minors, regardless of whether the child is acting in accordance with the terms and conditions, mitigation measures shall include, taking into account the industry standards referred to in Article 34, where needed and applicable:
(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure those prioritise the best interests of the child;
(b) adapting or removing system design features that expose children to, or promote, content, contact, conduct and contract risks;
(c) ensuring the highest levels of privacy, safety, and security by design and default for children, including as regards any profiling or use of data for commercial purposes;
(d) if a service is targeted at children, providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support.
4. Providers of intermediary services shall, upon request, explain to the Digital Services Coordinator of the Member State of establishment how they undertook this risk assessment and what voluntary mitigation measures they undertook.
2021/07/08
Committee: IMCO
Amendment 968 #
Proposal for a regulation
Article 12 a (new)
Article 12a
Child impact assessment
1. All providers must assess whether their services are accessed by, likely to be accessed by or impact on children. Providers of services likely to be accessed by or impact on children shall identify, analyse and assess, during the design and development of new services, on an ongoing basis and at least once a year thereafter, any systemic risks stemming from the functioning and use made of their services in the Union by children. These risk impact assessments shall be specific to their services, meet the highest European or international standards detailed in Article 34, and shall consider all known content, contact, conduct or commercial risks included in the contract. Assessments should also include the following systemic risks:
(a) the dissemination of illegal content or behaviour enabled, manifested on or as a result of their services;
(b) any negative effects for the exercise of the rights of the child, as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment;
(c) any intended or unintended consequences resulting from the operation or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection or rights of children.
2. When conducting child impact assessments, providers of intermediary services likely to impact children shall take into account, in particular, how their terms and conditions, content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions or with the rights of the child.
2021/07/08
Committee: IMCO
Amendment 973 #
Proposal for a regulation
Article 12 b (new)
Article 12b
Mitigation of risks to children
Providers of intermediary services likely to impact children shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 13 (12 a new). Such measures shall include, where applicable:
(a) implementing mitigation measures identified in Article 27 with regard for children’s best interests;
(b) adapting or removing system design features that expose children to content, contact, conduct and contract risks, as identified in the process of conducting child impact assessments;
(c) implementing proportionate and privacy-preserving age assurance, meeting the standard outlined in Article 34;
(d) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure they prioritise the best interests of the child;
(e) ensuring the highest levels of privacy, safety, and security by design and default for users under the age of 18;
(f) preventing profiling, including for commercial purposes like targeted advertising;
(g) ensuring published terms are age appropriate and uphold children’s rights;
(h) providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support.
2021/07/08
Committee: IMCO
Amendment 992 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the age of complainants (if children), the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
2021/07/08
Committee: IMCO
Amendment 997 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1a. Providers of intermediary services that impact on children shall publish, at least once a year:
(a) child impact assessments to identify known harms, unintended consequences and emerging risks. The child impact assessments must comply with the standards outlined in Article 34;
(b) clear, easily comprehensible and detailed reports outlining the child risk mitigation measures undertaken, their efficacy and any outstanding actions required. These reports must comply with the standards outlined in Article 34, including as regards age assurance and age verification, in line with a child-centred design.
2021/07/08
Committee: IMCO
Amendment 1010 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
1a. The Commission shall support and promote the development and implementation of standards set by relevant European and international standardisation bodies, subject to transparent, multi-stakeholder and inclusive processes in line with Regulation (EU) 1025/2012, for the protection and promotion of the rights of the child, observance of which, once adopted, will be mandatory for very large online platforms, at least for the following:
(a) age assurance and age verification;
(b) child impact assessments;
(c) child-centred and age-appropriate design;
(d) child-centred and age-appropriate terms and conditions.
2021/07/19
Committee: JURI
Amendment 1042 #
Proposal for a regulation
Article 36 a (new)
Article 36a
Codes of conduct for the protection of minors
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, and organisations representing minors, parents and civil society organisations or relevant authorities, to further contribute to the protection of minors online.
2. The Commission shall aim to ensure that the codes of conduct pursue an effective protection of minors online which respects their rights as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment. The Commission shall aim to ensure that the codes of conduct address at least:
(a) age verification and age assurance models, taking into account the industry standards referred to in Article 34;
(b) child-centred and age-appropriate design, taking into account the industry standards referred to in Article 34.
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.
2021/07/19
Committee: JURI
Amendment 1112 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission, acting on its own initiative, or the Board, acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
2021/07/19
Committee: JURI
Amendment 1124 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/19
Committee: JURI
Amendment 1128 #
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
2. Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/19
Committee: JURI
Amendment 1178 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling and redress systems are easy to access and user-friendly, including for children, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.
2021/07/08
Committee: IMCO
Amendment 1509 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
2. The profiling of children for commercial purposes, including targeted or personalised advertising, is prohibited, in compliance with the industry standards laid down in Article 34 and Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO
Amendment 1550 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/08
Committee: IMCO
Amendment 1563 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/08
Committee: IMCO
Amendment 1606 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/08
Committee: IMCO
Amendment 1626 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report in Article 28(3).
2021/07/08
Committee: IMCO
Amendment 1658 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
2021/07/08
Committee: IMCO
Amendment 1664 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by organisations which have been selected by the Commission and:
2021/07/08
Committee: IMCO
Amendment 1711 #
Proposal for a regulation
Article 30 – title
Additional online advertising transparency and protection
2021/07/08
Committee: IMCO
Amendment 1738 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. Very large online platforms shall be prohibited from profiling children under the age of 16 for commercial practices, including personalised advertising, in compliance with industry standards laid down in Article 34 and Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO
Amendment 1835 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
1a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child, observance of which, once adopted, will be mandatory for very large online platforms, at least for the following:
(a) age assurance and age verification;
(b) child impact assessments;
(c) child-centred and age-appropriate design;
(d) child-centred and age-appropriate terms and conditions.
2021/07/08
Committee: IMCO
Amendment 1839 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
2a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child, observance of which, once adopted, will be mandatory, at least for the following:
(a) age assurance and age verification pursuant to Article 13;
(b) child impact assessments pursuant to Article 13;
(c) age-appropriate terms and conditions pursuant to Article 12;
(d) child-centred design pursuant to Article 13.
2021/07/08
Committee: IMCO
Amendment 1893 #
Proposal for a regulation
Article 36 a (new)
Article 36a
Codes of conduct for the protection of minors
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, and organisations representing minors, parents and civil society organisations or relevant authorities, to further contribute to the protection of minors online.
2. The Commission shall aim to ensure that the codes of conduct pursue an effective protection of minors online which respects their rights as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment. The Commission shall aim to ensure that the codes of conduct address at least:
(a) age verification and age assurance models, taking into account the industry standards referred to in Article 34;
(b) child-centred and age-appropriate design, taking into account the industry standards referred to in Article 34.
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.
2021/07/08
Committee: IMCO
Amendment 2099 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
The Commission, acting on its own initiative, or the Board, acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
2021/07/08
Committee: IMCO
Amendment 2120 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/08
Committee: IMCO
Amendment 2130 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/08
Committee: IMCO