47 Amendments of Sophia IN 'T VELD related to 2020/0361(COD)
Amendment 168 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature, neither de jure nor de facto. A de facto obligation would occur if the non-implementation of a general or preventive monitoring infrastructure would be uneconomical, for instance due to the significant extra cost of alternative human oversight necessities or due to the threat of significant damage payments. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
Amendment 280 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
Amendment 299 #
Proposal for a regulation
Article 3 – paragraph 3
Amendment 306 #
Proposal for a regulation
Article 5 – paragraph 1 – point a
(a) does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent; or
Amendment 309 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content.
Amendment 323 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation, neither de jure nor de facto, to monitor the information which providers of intermediary services transmit or store, nor to seek or prevent facts or circumstances indicating illegal activity shall be imposed on those providers.
Amendment 326 #
Proposal for a regulation
Article 7 a (new)
Article 7 a
No limitation of anonymity
No general obligation to limit the anonymous or pseudonymous use of their services shall be imposed on providers of intermediary services.
Amendment 327 #
Proposal for a regulation
Article 7 b (new)
Article 7 b
No limitation of encryption and security
No general obligation to limit the level of their security and encryption measures shall be imposed on providers of intermediary services.
Amendment 328 #
Proposal for a regulation
Article 7 c (new)
Article 7 c
No general and indiscriminate retention of data
No general obligation to retain personal data of the recipients of their services shall be imposed on providers of intermediary services. Any obligation to retain data shall be limited to what is strictly necessary with respect to the categories of data to be retained, the means of communication affected, the persons concerned and the retention period adopted.
Amendment 335 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – introductory part
(a) the order contains the following elements:
Amendment 397 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, unambiguous and very easily comprehensible language and shall be publicly available in an easily accessible format.
Amendment 436 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their hosting service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices, on a case by case basis, exclusively by non-automated electronic means.
Amendment 437 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can unambiguously, without reasonable doubt, identify the manifest illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 444 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that are submitted by a competent judicial authority of the Member State where the hosting provider is established or its legal representative resides or is established and that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
Amendment 460 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. The mechanism referred to in paragraph 1 shall be provided free of charge. Where notices are manifestly unfounded or excessive, in particular because of their repetitive character, the provider of hosting services may either:
(a) charge a reasonable fee taking into account the administrative costs of processing the notices; or
(b) refuse to act on the request.
The provider of hosting services shall bear the burden of demonstrating the manifestly unfounded or excessive character of the notice.
Amendment 474 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
(c) where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means;
Amendment 514 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
The first subparagraph is without prejudice to the right of the recipient concerned to redress against the decision of the online platform before a court in accordance with the applicable law, as well as the right of the online platform concerned to redress against the decision of the out-of- court dispute settlement body before a court in accordance with the applicable law.
Amendment 559 #
Proposal for a regulation
Article 20 – paragraph 1
1. Providers of hosting services may suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content through their hosting services.
Amendment 565 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms may suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
Amendment 594 #
Proposal for a regulation
Article 24 – title
Online advertising transparency and control
Amendment 602 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) clear, meaningful and uniform information about the main parameters used to determine the recipient to whom the advertisement is displayed and the logic involved.
Amendment 607 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Amendment 609 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
3. Where a recipient exercises any of the rights referred to in points (a), (c) or (d) of paragraph 2, the online platform must immediately cease displaying advertisements using the personal data concerned or using parameters which were set using this data.
Amendment 620 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter and always before launching new services, any significant systemic risks stemming from the design, functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 622 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the design, functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 625 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
Amendment 641 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
Amendment 659 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1 a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report in Article 28(3).
Amendment 666 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
Amendment 668 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of each of the systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
Amendment 672 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights enshrined in the Charter of all parties involved. Before adopting those guidelines, the Commission shall organise public consultations and ask for the consent of the European Parliament.
Amendment 682 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
Amendment 701 #
Proposal for a regulation
Article 28 – paragraph 3 – point f a (new)
(f a) a description of specific elements that could not be audited, and an explanation of why these could not be audited;
Amendment 702 #
Proposal for a regulation
Article 28 – paragraph 3 – point f b (new)
(f b) where the audit opinion could not reach a conclusion for specific elements within the scope of the audit, a statement of reasons for the failure to reach such a conclusive opinion;
Amendment 734 #
Proposal for a regulation
Article 31 – paragraph 1
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only request, access, and use that data for those purposes.
Amendment 735 #
Proposal for a regulation
Article 31 – paragraph 2
Amendment 737 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment, three Digital Services Coordinators of destination, or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1).
Amendment 742 #
Proposal for a regulation
Article 31 – paragraph 3
Amendment 746 #
Proposal for a regulation
Article 31 – paragraph 4
Amendment 750 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 753 #
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested for one of the following two reasons:
Amendment 792 #
Proposal for a regulation
Article 38 – paragraph 3 – introductory part
3. Member States shall designate the Digital Services Coordinators within two months from the date of entry into force of this Regulation. When a Member State is subject to a procedure referred to in Article 7(1) or 7(2) of the Treaty on European Union, the Commission shall confirm that the Digital Services Coordinator proposed by that Member State fulfils the requirements laid down in Article 39 before that Digital Services Coordinator can be designated.
Amendment 798 #
Proposal for a regulation
Article 41 – paragraph 1 – point a
(a) the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period, unless that information is known to be protected by immunities and privileges in accordance with the applicable law;
Amendment 801 #
Proposal for a regulation
Article 41 – paragraph 3 – introductory part
3. Where needed for carrying out their tasks, Digital Services Coordinators shall also have, in respect of providers of hosting services under the jurisdiction of their Member State, where all other powers pursuant to this Article to bring about the cessation of an infringement have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the power to take the following measures:
Amendment 802 #
Proposal for a regulation
Article 41 – paragraph 3 – point b
(b) where the Digital Services Coordinator considers that the provider has not sufficiently complied with the requirements of the first indent, that the infringement persists and causes serious harm, and that the infringement entails a serious criminal offence involving a direct and imminent threat to the life or safety of persons, request the competent judicial authority of that Member State to order the temporary restriction of access to that infringement of recipients of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place.
Amendment 819 #
Proposal for a regulation
Article 45 – paragraph 7 a (new)
Amendment 855 #
Proposal for a regulation
Article 52 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period, unless that information is known to be protected by immunities and privileges in accordance with the applicable law.