27 Amendments of Nicola PROCACCINI related to 2020/0361(COD)
Amendment 131 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and means that Union citizens and other persons should be allowed, in an appropriate manner, to exercise online their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. To that end, it is unacceptable for the terms and conditions of such providers to be able to serve as a basis for the removal or disabling of content that is not manifestly illegal.
Amendment 141 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be appropriately differentiated from the concept of “potentially illegal content”. In particular, the concept of “illegal content” should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 163 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against manifestly illegal content related to serious crimes that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith, diligently and never on a discretionary basis. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 164 #
Proposal for a regulation
Recital 26
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content.
Amendment 169 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial, administrative or police authorities should be the only bodies entitled to decide on the removal of specific content, except in the case of manifestly illegal content related to serious crimes, which might require immediate intervention by the intermediary service provider. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders.
Amendment 183 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a significant role in tackling illegal content under existing law, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place mechanisms that facilitate the detection of illegal content, especially following a report from a competent national public authority.
Amendment 185 #
Proposal for a regulation
Recital 42
(42) A hosting service provider should not decide by itself to remove or disable information provided by a recipient of the service, either following receipt of a notice or acting on its own initiative, including through the use of automated means, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression, unless the content is manifestly illegal and related to serious crimes. Even in the latter case, available recourses to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 194 #
Proposal for a regulation
Recital 46
(46) Action against content that is illegal under existing law can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by the national authorities responsible are followed up immediately. Such authorities must be solely public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’).
Amendment 200 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom by online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 286 #
Proposal for a regulation
Article 2 – paragraph 1 – point g – point i (new)
(i) ‘potentially illegal content’ means content the unlawfulness of which is not beyond reasonable doubt, but which contains suspicious indicators;
Amendment 317 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and reporting content that is illegal under existing law to the national public authorities responsible.
Amendment 395 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions, specifying that they are not regulatory requirements. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. The terms and conditions laid down by providers of intermediary services shall not serve as a basis for the arbitrary removal of online content, as only the national public authorities responsible are entitled to order that online content be disabled, with the exception of manifestly illegal content related to serious crimes, which calls for immediate action by those providers.
Amendment 434 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow the detection of content that is illegal under existing law, especially following a report from a competent national public authority.
Amendment 438 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
Amendment 490 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions.
Amendment 533 #
Proposal for a regulation
Article 19
Amendment 535 #
Proposal for a regulation
Article 19 – paragraph 1
Amendment 539 #
Proposal for a regulation
Article 19 – paragraph 2
Amendment 541 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
Amendment 543 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
Amendment 545 #
Proposal for a regulation
Article 19 – paragraph 2 – point c
Amendment 547 #
Proposal for a regulation
Article 19 – paragraph 3
Amendment 548 #
Proposal for a regulation
Article 19 – paragraph 4
Amendment 549 #
Proposal for a regulation
Article 19 – paragraph 5
Amendment 552 #
Proposal for a regulation
Article 19 – paragraph 6
Amendment 554 #
Proposal for a regulation
Article 19 – paragraph 7
Amendment 560 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide content that is manifestly illegal by law.