58 Amendments of Michal ŠIMEČKA related to 2020/0361(COD)
Amendment 168 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature, neither de jure nor de facto. A de facto obligation would occur if the non-implementation of a general or preventive monitoring infrastructure would be uneconomical, for instance due to the significant extra cost of alternative human oversight necessities or due to the threat of significant damage payments. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
Amendment 185 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors, women and vulnerable users, such as those with protected characteristics under Article 21 of the Charter, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 205 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination, the right to gender equality and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
Amendment 280 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
Amendment 299 #
Proposal for a regulation
Article 3 – paragraph 3
Amendment 306 #
Proposal for a regulation
Article 5 – paragraph 1 – point a
(a) does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent; or
Amendment 309 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content.
Amendment 323 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation, neither de jure nor de facto, to monitor the information which providers of intermediary services transmit or store, nor to seek or prevent facts or circumstances indicating illegal activity shall be imposed on those providers.
Amendment 326 #
Proposal for a regulation
Article 7 a (new)
Article 7 a
No limitation of anonymity
No general obligation to limit the anonymous or pseudonymous use of their services shall be imposed on providers of intermediary services.
Amendment 327 #
Proposal for a regulation
Article 7 b (new)
Article 7 b
No limitation of encryption and security
No general obligation to limit the level of their security and encryption measures shall be imposed on providers of intermediary services.
Amendment 328 #
Proposal for a regulation
Article 7 c (new)
Article 7 c
No general and indiscriminate retention of data
No general obligation to retain personal data of the recipients of their services shall be imposed on providers of intermediary services. Any obligation to retain data shall be limited to what is strictly necessary with respect to the categories of data to be retained, the means of communication affected, the persons concerned and the retention period adopted.
Amendment 335 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – introductory part
(a) the order contains the following elements:
Amendment 342 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety, health and trust of the recipients of the service, including minors, women and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to provide recourse to recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 397 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, unambiguous, very easily comprehensible language and shall be publicly available in an easily accessible format.
Amendment 436 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their hosting service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices on a case by case basis, exclusively by non-automated electronic means.
Amendment 437 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can unambiguously, without reasonable doubt, identify the manifest illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 444 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that are submitted by a competent judicial authority of the Member State where the hosting provider is established or its legal representative resides or is established and that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
Amendment 460 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. The mechanism referred to in paragraph 1 shall be provided free of charge. Where notices are manifestly unfounded or excessive, in particular because of their repetitive character, the provider of hosting services may either:
(a) charge a reasonable fee taking into account the administrative costs of processing the notices; or
(b) refuse to act on the request.
The provider of hosting services shall bear the burden of demonstrating the manifestly unfounded or excessive character of the notice.
Amendment 474 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
(c) where applicable, information on automated means used in taking the decision, including where the decision was taken in respect of content detected or identified using automated means;
Amendment 478 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service through the submission of abusive notices, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 480 #
Proposal for a regulation
Article 15 – paragraph 4 a (new)
4 a. Paragraphs 2 and 4 shall not apply to providers of hosting services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 513 #
Proposal for a regulation
Article 18 – paragraph 1 – introductory part
1. Recipients of the service addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 514 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
The first subparagraph is without prejudice to the right of the recipient concerned to redress against the decision of the online platform before a court in accordance with the applicable law, as well as the right of the online platform concerned to redress against the decision of the out-of-court dispute settlement body before a court in accordance with the applicable law.
Amendment 559 #
Proposal for a regulation
Article 20 – paragraph 1
1. Providers of hosting services may suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content through their hosting services.
Amendment 565 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms may suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
Amendment 620 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter and always before launching new services, any significant systemic risks stemming from the design, functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 666 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
Amendment 672 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. Before adopting those guidelines the Commission shall organise public consultations and ask for the consent of the European Parliament.
Amendment 727 #
Proposal for a regulation
Article 30 – paragraph 2 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed, unless that information concerns providers of intermediary services that qualify as micro, small or medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC;
Amendment 734 #
Proposal for a regulation
Article 31 – paragraph 1
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only request, access, and use that data for those purposes.
Amendment 735 #
Proposal for a regulation
Article 31 – paragraph 2
Amendment 742 #
Proposal for a regulation
Article 31 – paragraph 3
Amendment 746 #
Proposal for a regulation
Article 31 – paragraph 4
Amendment 750 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 753 #
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested because of one of the following two reasons:
Amendment 792 #
Proposal for a regulation
Article 38 – paragraph 3 – introductory part
3. Member States shall designate the Digital Services Coordinators within two months from the date of entry into force of this Regulation. When a Member State is subject to a procedure referred to in Article 7(1) or 7(2) of the Treaty on European Union, the Commission shall confirm that the Digital Services Coordinator proposed by that Member State fulfils the requirements laid down in Article 39 before that Digital Services Coordinator can be designated.
Amendment 798 #
Proposal for a regulation
Article 41 – paragraph 1 – point a
(a) the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period, unless that information is known to be protected by immunities and privileges in accordance with the applicable law;
Amendment 801 #
Proposal for a regulation
Article 41 – paragraph 3 – introductory part
3. Where needed for carrying out their tasks, Digital Services Coordinators shall also have, in respect of providers of hosting services under the jurisdiction of their Member State, where all other powers pursuant to this Article to bring about the cessation of an infringement have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the power to take the following measures:
Amendment 802 #
Proposal for a regulation
Article 41 – paragraph 3 – point b
(b) where the Digital Services Coordinator considers that the provider has not sufficiently complied with the requirements of the first indent, that the infringement persists and causes serious harm, and that the infringement entails a serious criminal offence involving a direct and imminent threat to the life or safety of persons, request the competent judicial authority of that Member State to order the temporary restriction of access to that infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place.
Amendment 819 #
Proposal for a regulation
Article 45 – paragraph 7 a (new)
Amendment 847 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or upon request of at least three of the Digital Services Coordinators of destination, may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
Amendment 855 #
Proposal for a regulation
Article 52 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period, unless that information is known to be protected by immunities and privileges in accordance with the applicable law.
Amendment 856 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 864 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
Amendment 882 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
Amendment 898 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report in Article 28(3).
Amendment 914 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
Amendment 1112 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
Amendment 1124 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
Amendment 1128 #
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
2. When the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
Amendment 1550 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 1563 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
Amendment 1606 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
Amendment 1626 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report in Article 28(3).
Amendment 1658 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
Amendment 2099 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
Amendment 2120 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
Amendment 2130 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
When the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.