61 Amendments of Beata KEMPA related to 2020/0361(COD)
Amendment 130 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression, including the freedom to receive and impart information and ideas in an open and democratic society, the freedom of polemic or controversial views in the course of public debate, the freedom of media and access to information, and the freedom to conduct a business, the right to non-discrimination, and the right to protect personal data.
Amendment 140 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should underpin the general idea that what is illegal offline should also be illegal online. The concept should be defined broadly to cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech, terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 145 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. The concept of ‘dissemination to the public’ should not apply to cloud services, including business-to-business cloud services, with respect to which the service provider has no contractual rights concerning what content is stored or how it is processed or made publicly available by its customers or by the end-users of such customers, and where the service provider has no technical capability to remove specific content stored by its customers or the end-users of its services. Where a service provider offers several services, this Regulation should be applied only in respect of the services that fall within its scope. _________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36.
Amendment 155 #
Proposal for a regulation
Recital 20
(20) A provider of intermediary services that deliberately collaborates with a recipient of the services in order to undertake illegal activities, or the main purpose of which is to engage in or facilitate such activities, does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
Amendment 156 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in observance of the principle of freedom of expression, the freedom to receive and impart information and ideas in an open and democratic society, and the freedom and pluralism of the media. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and, where appropriate, act against the allegedly illegal content.
Amendment 176 #
Proposal for a regulation
Recital 32
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. Therefore, orders about non-personal information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information.
Amendment 189 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means after human verification of such notice, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. The available means of redress to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 195 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
Amendment 205 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, but not less than six months, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
Amendment 207 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The parameters shall include, if applicable: the optimisation goal selected by the advertiser; information on the use of custom lists and, in such case, the category and source of personal data uploaded to the online platform and the legal basis for uploading this personal data pursuant to Regulation (EU) 2016/679; information on the use of lookalike audiences and, in such case, relevant information on the seed audience and an explanation of why the recipient of the advertisement has been determined to be part of the lookalike audience; and meaningful information about the online platform’s algorithms or other tools used to optimise the delivery of the advertisement, including a specification of the optimisation goal and a meaningful explanation of the reasons why the online platform has decided that the optimisation goal can be achieved by displaying the advertisement to this recipient. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
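By way of illustration, the disclosure the amended recital enumerates can be read as a per-recipient record. A minimal sketch, assuming hypothetical field and class names that are not part of the proposal:

```python
# Hypothetical sketch of the per-recipient advertising disclosure that the
# amended Recital 52 enumerates; all names are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CustomListDisclosure:
    personal_data_category: str   # category of personal data uploaded to the platform
    personal_data_source: str     # source of that personal data
    gdpr_legal_basis: str         # legal basis for the upload under Regulation (EU) 2016/679

@dataclass
class LookalikeDisclosure:
    seed_audience_info: str       # relevant information on the seed audience
    membership_explanation: str   # why this recipient was matched into the audience

@dataclass
class AdTransparencyRecord:
    advertiser: str                          # on whose behalf the advertisement is displayed
    displayed_at: str                        # when it was displayed (ISO 8601 timestamp)
    optimisation_goal: Optional[str] = None  # goal selected by the advertiser, if applicable
    custom_list: Optional[CustomListDisclosure] = None
    lookalike: Optional[LookalikeDisclosure] = None
    delivery_explanation: str = ""           # why the platform decided to show the ad here
```

The optional fields mirror the recital's "if applicable" qualifier: a disclosure would carry the custom-list or lookalike elements only when those techniques were actually used.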
Amendment 215 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression, access to information, the freedom and pluralism of the media, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform, to restrictions on access to content under professional editorial responsibility, or to the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and the protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 224 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should, as part of such mitigating measures, consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms should reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They should also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 235 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient, and that those options are used by default.
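A minimal sketch of the default rule in the last sentence, under hypothetical names: the option not based on profiling is what a recipient gets unless they actively switch.

```python
# Hypothetical sketch: the recommender option not based on profiling is the
# default, as the amended Recital 62 requires; names are invented.
from dataclasses import dataclass

@dataclass
class RecommenderOption:
    name: str
    uses_profiling: bool

OPTIONS = [
    RecommenderOption("personalised feed", uses_profiling=True),
    RecommenderOption("chronological feed", uses_profiling=False),
]

def default_option(options):
    """Pick a non-profiling option as the default; profiling-based ranking
    applies only if the recipient actively selects it."""
    for option in options:
        if not option.uses_profiling:
            return option
    raise ValueError("at least one non-profiling option must be offered")

assert not default_option(OPTIONS).uses_profiling
```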
Amendment 255 #
Proposal for a regulation
Recital 76
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction. In addition, in order to ensure effective protection of the fundamental rights of EU citizens that takes into account diverse national laws and differences in socio-cultural context between countries, a Member State shall exercise jurisdiction where it concerns very large online platforms which offer services to a significant number of recipients in a given Member State. Member States' jurisdiction is particularly important in the case of very large online platforms which are social media, because they play a central role in facilitating the public debate.
Amendment 262 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, competition, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
Amendment 291 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system, working under strict human oversight, used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient, or otherwise determining the relative order or prominence of information displayed;
Amendment 301 #
Proposal for a regulation
Article 3 – paragraph 3
3. This Article shall not affect the possibility for a court or functionally independent administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
Amendment 330 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item or multiple items of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 356 #
Proposal for a regulation
Article 8 – paragraph 3 a (new)
3 a. The Digital Services Coordinator of each Member State, on its own initiative, within 72 hours of receiving the copy of the order to act, has the right to scrutinise the order to determine whether it seriously or manifestly infringes the respective Member State’s law and to revoke the order on its own territory.
Amendment 390 #
Proposal for a regulation
Article 10 – paragraph 2
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact, including postal address, and ensure that that information is up to date. Providers of intermediary services shall notify that information, including the name, postal address, electronic mail address and telephone number of their single point of contact, to the Digital Services Coordinator in the Member State where they are established.
Amendment 392 #
Proposal for a regulation
Article 11 – paragraph 4
4. Providers of intermediary services shall notify valid identification data, including the name, postal address, electronic mail address and telephone number of their legal representative, to the Digital Services Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is up to date.
Amendment 394 #
Proposal for a regulation
Article 11 – paragraph 5 a (new)
5 a. A very large online platform, as defined in Article 25, shall, at the request of the Digital Services Coordinator of a Member State where it offers its services, designate a legal representative bound by the obligations laid down in this Article.
Amendment 398 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, plain, intelligible and unambiguous language and shall be publicly available in an easily accessible format.
Amendment 407 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. Very large online platforms, as defined in Article 25, should publish their terms and conditions in all official languages of the Union.
Amendment 408 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
Amendment 410 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2 c. Notwithstanding the right in Article 12(3), the Digital Services Coordinator of each Member State may, by means of national legislation, request a very large online platform to cooperate with it in handling specific cases concerning the removal of legal content in which there is reason to believe that the Member State’s socio-cultural context may have played a vital role.
Amendment 413 #
Proposal for a regulation
Article 12 a (new)
Article 12 a Any restrictions referred to in paragraph 1 must respect the fundamental rights enshrined in the Charter.
Amendment 415 #
Proposal for a regulation
Article 12 b (new)
Article 12 b Providers of intermediary services shall notify their users at least 30 days in advance of any changes to their terms and conditions or of any algorithmic changes.
Amendment 420 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average and median time needed for taking the action;
Amendment 423 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed.
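Points (b) and (d) as amended ask for both the average and the median handling time. A small worked example, with invented durations, shows why reporting both is informative:

```python
# Invented example: five notice-handling durations in hours.
from statistics import mean, median

handling_hours = [1.5, 2.0, 3.5, 4.0, 48.0]

print(f"average: {mean(handling_hours):.1f} h")    # 11.8 h, pulled up by one slow case
print(f"median:  {median(handling_hours):.1f} h")  # 3.5 h, the typical case
```

The average is sensitive to a handful of very slow cases, while the median reflects the typical one; together they give a fuller picture of how quickly notices and complaints are actually handled.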
Amendment 428 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
2 a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down specific templates for the reports specified in paragraph 1.
Amendment 440 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary and applicable, additional information enabling the identification of the illegal content, which shall be appropriate to the type of content and to the specific type of intermediary;
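Read together with the rest of Article 14(2) of the proposal (explanation of reasons, notifier identification, good-faith statement), the amended point (b) yields a notice shaped roughly as follows. This is a hypothetical sketch, not the Regulation's own schema:

```python
# Hypothetical sketch of a notice under Article 14(2) as amended; field names
# are invented for illustration.
from dataclasses import dataclass
from typing import List

@dataclass
class Notice:
    explanation: str                # (a) why the information is considered illegal content
    urls: List[str]                 # (b) exact URL or URLs of the information
    additional_identification: str  # (b) extra identifying information, where necessary and
                                    #     applicable, appropriate to content and intermediary type
    notifier_name: str              # (c) name of the notifying individual or entity
    notifier_email: str             # (c) electronic mail address of the notifier
    good_faith_statement: bool      # (d) statement of good-faith belief in accuracy

def is_sufficiently_precise(notice: Notice) -> bool:
    """Rough plausibility check, not a legal test: a diligent operator needs at
    least an explanation and one electronic location to assess the notice."""
    return bool(notice.explanation) and len(notice.urls) > 0
```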
Amendment 466 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services engages in any content moderation, irrespective of the means used for detecting, identifying, removing or disabling access to the information concerned and of the reason for its decision, it shall inform the recipient prior to enforcing the decision and provide a clear and specific statement of reasons for that decision. This obligation shall not apply to content constituting incitement to violence or child sexual abuse material.
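A minimal sketch of the sequencing the amended paragraph imposes, under hypothetical names: the statement of reasons reaches the recipient before the decision is enforced, except for the two content categories the amendment exempts.

```python
# Hypothetical sketch: notify-before-enforcement under Article 15(1) as amended.
EXEMPT_CATEGORIES = {"incitement_to_violence", "child_sexual_abuse"}

def enforce_moderation(decision: dict, notify, apply_action) -> None:
    """decision carries 'category' and 'statement_of_reasons';
    notify and apply_action are callables supplied by the platform."""
    if decision["category"] not in EXEMPT_CATEGORIES:
        # Inform the recipient prior to enforcing the decision, with a clear
        # and specific statement of reasons.
        notify(decision["statement_of_reasons"])
    apply_action(decision)
```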
Amendment 486 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC and which do not engage in illegal activity.
Amendment 495 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(c a) any other decisions that affect the availability, visibility or accessibility of that content, the recipient’s account, or the recipient’s access to significant features of the platform’s regular services;
Amendment 505 #
Proposal for a regulation
Article 17 – paragraph 4
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. This feedback shall also include:
- information on whether the decision referred to in paragraph 1 was taken as a result of human review or through automated means;
- in case the decision referred to in paragraph 1 is to be sustained, a detailed explanation of how the information to which the complaint relates is in breach of the platform’s terms and conditions or why the online platform finds the information unlawful.
Amendment 508 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means. Complainants shall have the right to request human review and consultation with relevant online platforms’ staff with respect to the content to which the complaint relates.
Amendment 511 #
Proposal for a regulation
Article 17 – paragraph 5 a (new)
5 a. Recipients of the service negatively affected by the decision of an online platform shall have the possibility to seek swift judicial redress in accordance with the laws of the Member States concerned. The procedure shall ensure that an independent judicial body decides on the matter without undue delay, resolving the case no later than within 14 days, while granting the negatively affected party the right to seek interim measures to be imposed within 48 hours of the matter being brought before that body. The right to seek judicial redress and interim measures shall not be limited or made conditional on exhausting the internal complaint-handling system.
Amendment 528 #
Proposal for a regulation
Article 18 – paragraph 6 – point 1 (new)
(1) Member States shall establish a mechanism enabling the recipients of the service to contest decisions of out-of-court dispute settlement bodies before a national judicial authority relevant for resolving disputes related to freedom of expression
Amendment 530 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
6 a. Member States shall establish a mechanism enabling the recipients of the service to contest decisions of out-of-court dispute settlement bodies before a national judicial authority or an administrative authority relevant for resolving disputes related to freedom of expression
Amendment 604 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) meaningful information about all parameters used to determine the recipient to whom the advertisement is displayed.
Amendment 612 #
Proposal for a regulation
Article 24 a (new)
Article 24 a 2. Online platforms shall present personalised advertising only on the basis of data explicitly provided to them or declared by recipients of the service, and provided that they have granted consent for the use of this data for the purposes of delivering personalised advertising.
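The proposed article makes two conditions cumulative: the data must have been explicitly provided or declared by the recipient, and the recipient must have consented to its use for personalised advertising. A minimal sketch with hypothetical names:

```python
# Hypothetical sketch of the double gate in the proposed Article 24a.
from dataclasses import dataclass
from typing import List

@dataclass
class DataPoint:
    value: str
    explicitly_provided: bool  # declared by the recipient, not inferred or observed

@dataclass
class Recipient:
    ad_personalisation_consent: bool
    data: List[DataPoint]

def personalisation_inputs(recipient: Recipient) -> List[DataPoint]:
    """Return only the data usable for personalised advertising; an empty list
    means the recipient may only be shown non-personalised advertisements."""
    if not recipient.ad_personalisation_consent:
        return []
    return [d for d in recipient.data if d.explicitly_provided]
```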
Amendment 613 #
Proposal for a regulation
Article 24 b (new)
Article 24 b 3. Online platforms that use algorithms to deliver advertisements shall set out in their terms and conditions relevant information on the functioning of these algorithms, including the main criteria used by the algorithm and the categories and sources of input data.
Amendment 710 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems, or any other systems used to determine the order of presentation of content or to decrease the visibility of content, shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in these systems.
Amendment 715 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1 a. The main parameters referred to in paragraph 1 shall include, at minimum:
(a) the main criteria used by the relevant recommender system;
(b) how these criteria are weighted against each other;
(c) the optimisation goal of the relevant recommender system;
(d) an explanation of the role that the behaviour of the recipients of the service plays in how the relevant recommender system functions.
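The four elements in points (a) to (d) amount to a structured disclosure per recommender system. A sketch with invented names and example values:

```python
# Hypothetical sketch of the disclosure required by the proposed paragraph 1a.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class MainParametersDisclosure:
    main_criteria: List[str]            # (a) the main criteria used by the system
    criteria_weights: Dict[str, float]  # (b) how the criteria are weighted against each other
    optimisation_goal: str              # (c) the optimisation goal of the system
    behaviour_role: str                 # (d) role of the recipients' own behaviour

example = MainParametersDisclosure(
    main_criteria=["recency", "followed accounts", "past engagement"],
    criteria_weights={"recency": 0.2, "followed accounts": 0.5, "past engagement": 0.3},
    optimisation_goal="predicted time spent on the platform",
    behaviour_role="clicks and watch time raise similar content in the ranking",
)
```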
Amendment 717 #
Proposal for a regulation
Article 29 – paragraph 1 b (new)
1 b. Very large online platforms shall provide options for the recipients of the service to modify or influence the parameters referred to in paragraph 1 a, including at least one option which is not based on profiling, within the meaning of Article 4(4) of Regulation (EU) 2016/679
Amendment 720 #
Proposal for a regulation
Article 29 – paragraph 2
2. Very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service:
a) to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them;
b) to select third-party recommender systems.
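The functionality described in points (a) and (b) implies a pluggable ranking interface: options the recipient can switch between at any time, including recommenders supplied by third parties. A sketch under those assumptions, with invented names:

```python
# Hypothetical sketch of the selector implied by Article 29(2) as amended.
from typing import Callable, Dict, List, Optional

# A recommender takes content identifiers and returns them reordered.
Recommender = Callable[[List[str]], List[str]]

class RecommenderRegistry:
    def __init__(self) -> None:
        self._options: Dict[str, Recommender] = {}
        self._preference: Optional[str] = None

    def register(self, name: str, recommender: Recommender) -> None:
        """Platform-provided and third-party recommenders register alike."""
        self._options[name] = recommender

    def select(self, name: str) -> None:
        """(a) select or modify the preferred option at any time."""
        if name not in self._options:
            raise KeyError(f"unknown recommender option: {name}")
        self._preference = name

    def rank(self, items: List[str]) -> List[str]:
        if self._preference is None:
            raise RuntimeError("no recommender option selected")
        return self._options[self._preference](items)

registry = RecommenderRegistry()
registry.register("chronological", lambda items: sorted(items))        # platform option
registry.register("third-party", lambda items: list(reversed(items)))  # (b) external option
registry.select("chronological")
```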
Amendment 775 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board shall have the right to request and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
Amendment 777 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall request the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 796 #
Proposal for a regulation
Article 40 – paragraph 3 a (new)
3 a. Member States shall exercise jurisdiction for the purposes of Chapters III and IV of this Regulation where it concerns very large online platforms, as defined in Article 25, which offer services to a significant number of active recipients of the service in a given Member State, which can be calculated on the basis of Article 23(2).
Amendment 805 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Assessment of the complaint may be supplemented by the opinion of the Digital Services Coordinator of the Member State where the recipient resides or is established on how the matter should be resolved, taking into account the national law and socio-cultural context of that Member State. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority.
Amendment 809 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Pursuant to paragraph 1, the Digital Services Coordinator of establishment should, in cases concerning a complaint transmitted by the Digital Services Coordinator of the Member State where the recipient resides or is established, assess the matter in a timely manner and inform the Digital Services Coordinator of the Member State where the recipient resides or is established of how the complaint has been handled.
Amendment 813 #
Proposal for a regulation
Article 45 – paragraph 1 a (new)
Amendment 814 #
Proposal for a regulation
Article 45 – paragraph 2 a (new)
2 a. A recommendation pursuant to paragraphs 1 and 2 may additionally indicate:
a) an opinion on matters that involve taking into account national law and socio-cultural context;
b) a draft decision based on an investigation pursuant to paragraph 1 a.
Amendment 818 #
Proposal for a regulation
Article 45 – paragraph 7
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Services Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. This information should also be transmitted to the Digital Services Coordinator or the Board that initiated the proceedings pursuant to paragraph 1.
Amendment 834 #
Proposal for a regulation
Article 48 – paragraph 6
6. The Board shall adopt its rules of procedure and inform the Commission thereof.
Amendment 836 #
Proposal for a regulation
Article 49 – paragraph 1 – point d
(d) adopt opinions on issues concerning very large online platforms in accordance with this Regulation;
Amendment 837 #
Proposal for a regulation
Article 49 – paragraph 1 – point e a (new)
(e a) issue opinions, recommendations or advice on matters related to Article 34.
Amendment 856 #
Proposal for a regulation
Article 52 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, their legal representatives, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period.
Amendment 908 #
Proposal for a regulation
Article 73 – paragraph 1
1. By three years after the entry into force of this Regulation at the latest, and every three years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. On the basis of the findings and taking into utmost account the opinion of the Board, that report shall, where appropriate, be accompanied by a proposal for amendment of this Regulation.
Amendment 909 #
Proposal for a regulation
Article 73 – paragraph 4