Activities of Dace MELBĀRDE related to 2020/0361(COD)
Shadow opinions (1)
OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
Amendments (51)
Amendment 177 #
Proposal for a regulation
Recital 36 a (new)
(36 a) Very large online platforms should provide for the possibility to communicate with their points of contact in each of the official languages of the Member States where they provide services. Other providers of intermediary services should ensure that the choice of language does not impose a disproportionate burden on Member States' authorities and should make every effort to establish effective communication options. A possible language barrier should not be invoked to ignore or deny communication with a Member State's authorities and should not be used as an excuse for inaction. Where necessary, Member States' authorities and providers of intermediary services may reach a separate agreement on the language of communication.
Amendment 196 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. Such internal systems should also be available to individuals or entities that have submitted a notice. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 201 #
Proposal for a regulation
Recital 46
(46) Action against illegal content and against activities incompatible with the platform’s terms and conditions can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content or activities incompatible with the platform’s terms and conditions, that they represent public or collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
Amendment 228 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should, under such mitigating measures, consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding democracy and public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 230 #
Proposal for a regulation
Recital 59
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, relevant regulatory authorities, representatives of groups potentially impacted by their services, independent experts and civil society organisations.
Amendment 236 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should be searchable, easy to access and functional and should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
Amendment 238 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides frameworks for compelling access to data from very large online platforms to the Digital Services Coordinator and the Commission, on the one hand, and to vetted researchers, on the other hand. All requirements for access to data under those frameworks should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
Amendment 254 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be encouraged to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
Amendment 268 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other principal service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation.
Amendment 269 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system, designed as a separate tool from the principal service offered and used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
Amendment 296 #
Proposal for a regulation
Article 10 – paragraph 3
3. Providers of intermediary services shall specify in the information referred to in paragraph 2 the official language or languages of the Union which can be used to communicate with their points of contact, and which shall include at least one of the official languages of the Member State in which the provider of intermediary services has its main establishment or where its legal representative resides or is established. Very large online platforms should provide for the possibility to communicate with their points of contact in each of the official languages of the Member States where they provide services.
Amendment 297 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall publish information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, inter alia, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. Providers of intermediary services shall inform of changes to their terms and conditions in a timely manner.
Amendment 314 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by category, including the type of alleged illegal content concerned, any action taken pursuant to the notices, differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;
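For illustration only, a minimal sketch of how a provider might aggregate the figures this point, as amended, would require from its internal records. The record shape, field names and category labels are assumptions, not anything prescribed by the amendment.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical internal record of one notice; field names are assumptions.
@dataclass
class NoticeRecord:
    category: str                  # e.g. an alleged-illegal-content type
    action_taken: bool
    action_basis: str | None       # "law" or "terms_and_conditions"
    hours_to_action: float | None  # time from notice to action

def transparency_figures(notices: list[NoticeRecord]) -> dict:
    """Aggregate what Article 13(1)(b), as amended, asks for: notices per
    category, actions split by their legal basis, and the average time
    needed for taking the action."""
    per_category: dict[str, int] = {}
    per_basis = {"law": 0, "terms_and_conditions": 0}
    times: list[float] = []
    for n in notices:
        per_category[n.category] = per_category.get(n.category, 0) + 1
        if n.action_taken and n.action_basis in per_basis:
            per_basis[n.action_basis] += 1
            if n.hours_to_action is not None:
                times.append(n.hours_to_action)
    return {
        "notices_by_category": per_category,
        "actions_by_basis": per_basis,
        "average_hours_to_action": mean(times) if times else None,
    }
```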
Amendment 323 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content or information that is incompatible with the terms and conditions of the provider. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means.
Amendment 324 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality or incompatibility of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 326 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
(a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content or incompatible with the provider’s terms and conditions;
Amendment 330 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) sufficiently precise information enabling the identification of the illegal or incompatible content, where relevant, including a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
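Taken together, Amendments 323–330 describe the shape of a valid notice. The sketch below models only the elements quoted above (Article 14(2)(a) and (b)); the class and field names are illustrative assumptions, not terms the Regulation defines.

```python
from dataclasses import dataclass, field

@dataclass
class Notice:
    # (a) why the notifier considers the information illegal or
    #     incompatible with the provider's terms and conditions
    reasons: str
    # (b) clear indication of the electronic location, in particular
    #     the exact URL or URLs
    urls: list[str] = field(default_factory=list)
    # (b) additional identifying information, where necessary
    additional_info: str | None = None

    def is_sufficiently_precise(self) -> bool:
        """Rough stand-in for the 'diligent economic operator' test in
        paragraph 2: the notice must substantiate its claim and point
        to the content it concerns."""
        return bool(self.reasons.strip()) and (
            bool(self.urls) or self.additional_info is not None
        )
```

An intake endpoint built along these lines could reject submissions for which is_sufficiently_precise() is false and prompt the notifier for the missing element.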
Amendment 352 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, and individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 354 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, restrict or disable access to the information;
Amendment 355 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) decisions to suspend or terminate the recipients’ account;
Amendment 356 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(c a) decisions not to act upon the receipt of a notice.
Amendment 376 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content or content incompatible with the platform’s terms and conditions;
Amendment 380 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents public or collective interests and is independent from any online platform;
Amendment 405 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed, and who finances the advertisement;
Amendment 424 #
Proposal for a regulation
Article 27 – paragraph 1 – point d a (new)
(d a) initiating or adjusting cooperation with media service providers;
Amendment 444 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a searchable, easy to access and functional repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
Amendment 445 #
Proposal for a regulation
Article 30 – paragraph 2 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed, and who finances the advertisement;
Amendment 446 #
Proposal for a regulation
Article 30 – paragraph 2 – point d
(d) whether the advertisement was intended to be displayed specifically to or concealed specifically from one or more particular groups of recipients of the service and if so, the main parameters used for that purpose;
Amendment 447 #
Proposal for a regulation
Article 30 – paragraph 2 – point e
(e) the total number of recipients of the service reached and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically;
Amendment 449 #
Proposal for a regulation
Article 30 – paragraph 2 – point e a (new)
(e a) whether the advertisement has been labelled, moderated, or disabled.
Amendment 452 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions, media, civil society or international organisations representing the public interest, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
Amendment 460 #
Proposal for a regulation
Article 33 – paragraph 1
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months. The reports shall include information disaggregated by Member State and provide information on the human and technical resources allocated for the purpose of content moderation for each official language of the Union.
Amendment 1027 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content or information that is incompatible with the terms and conditions of the provider. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means.
Amendment 1035 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality or incompatibility of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 1039 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
(a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content or incompatible with the provider’s terms and conditions;
Amendment 1046 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal or incompatible content;
Amendment 1144 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, and individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the decision taken by the provider of the online platform not to act upon the receipt of a notice or against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 1151 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions whether or not to remove or disable access to or restrict visibility of the information;
Amendment 1160 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
Amendment 1162 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) decisions whether or not to suspend or terminate the recipients’ account.
Amendment 1165 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions whether or not to restrict the ability to monetise content provided by the recipients;
Amendment 1172 #
Proposal for a regulation
Article 17 – paragraph 1 – point c b (new)
(cb) decisions whether or not to apply labels or additional information on content.
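Amendments 1144–1172 together broaden the set of decisions that must be open to an internal complaint. As a purely illustrative summary, one might model the appealable decision types as the enum below; the names are assumptions, not terms from the text.

```python
from enum import Enum, auto

class AppealableDecision(Enum):
    NO_ACTION_ON_NOTICE = auto()                    # introductory part (Am. 1144)
    REMOVE_DISABLE_OR_RESTRICT_VISIBILITY = auto()  # point (a) (Am. 1151)
    SUSPEND_OR_TERMINATE_SERVICE = auto()           # point (b) (Am. 1160)
    SUSPEND_OR_TERMINATE_ACCOUNT = auto()           # point (c) (Am. 1162)
    RESTRICT_MONETISATION = auto()                  # point (ca) (Am. 1165)
    APPLY_LABELS_OR_ADDITIONAL_INFO = auto()        # point (cb) (Am. 1172)
```

Because each point reads "whether or not", a complaint system built on this list would have to accept appeals against both positive and negative decisions, i.e. from the affected recipient as well as from the notifier.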
Amendment 1269 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content or an incompatibility of the content with the platform’s terms and conditions;
Amendment 1321 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content or content that is incompatible with their terms and conditions.
Amendment 1490 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
Amendment 1636 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of each of the systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
Amendment 1713 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a searchable, easy to access and functional repository containing the information referred to in paragraph 2, until three years after the advertisement was displayed for the last time on their online interfaces. They shall ensure that multi-criterion queries can be performed per advertiser and per all data points present in the advertisement. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed. They shall ensure that, where advertisements have been labelled, moderated or disabled, those labels are clearly visible and identifiable for users and researchers.
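A minimal sketch of what one repository entry and the required multi-criterion query could look like, restricted to the data points quoted in Amendments 445–449 above. Every name here is an illustrative assumption; the amendment mandates the capability, not any particular schema.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class AdRecord:
    on_behalf_of: str        # point (b): person on whose behalf displayed
    financed_by: str         # point (b): person financing the advertisement
    last_displayed: date     # drives the three-year retention window
    targeting_parameters: dict[str, str] = field(default_factory=dict)  # point (d)
    total_recipients_reached: int = 0  # point (e): aggregates only, no personal data
    moderation_labels: list[str] = field(default_factory=list)  # point (e a)

# Amendment 1713: "until three years after the advertisement was displayed
# for the last time"; approximated here as 3 * 365 days.
RETENTION = timedelta(days=3 * 365)

def still_retained(record: AdRecord, today: date) -> bool:
    return today - record.last_displayed <= RETENTION

def query(repo: list[AdRecord], **criteria: str) -> list[AdRecord]:
    """Multi-criterion lookup per advertiser and per data point, e.g.
    query(repo, on_behalf_of="ACME", financed_by="ACME")."""
    return [r for r in repo
            if all(str(getattr(r, k, "")) == v for k, v in criteria.items())]
```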
Amendment 1758 #
Proposal for a regulation
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate, and with an easily accessible and user-friendly mechanism to search for multiple criteria, such as those reported in accordance with the obligations set out in Articles 13, 23 and 33.
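Article 31(3), as amended, asks for a comparable multi-criteria search over the data reported under Articles 13, 23 and 33. One way to picture that researcher-facing mechanism, assuming a flat record format that the provision itself does not specify:

```python
from typing import Any, Iterable

def search(records: Iterable[dict[str, Any]], **criteria: Any) -> list[dict[str, Any]]:
    """Return every record matching all supplied criteria, e.g.
    search(reports, member_state="LV", reporting_article="13")."""
    return [r for r in records
            if all(r.get(key) == value for key, value in criteria.items())]
```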
Amendment 1763 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions, journalists, civil society organisations or international organisations representing the public interest, shall be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
Amendment 1770 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this legislation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 1796 #
Proposal for a regulation
Article 33 – paragraph 1
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months. The reports shall include information disaggregated by Member State and shall clearly state the human and technical resources allocated for the purpose of content moderation for each official EU language.