Activities of Sylvie GUILLAUME related to 2020/0361(COD)
Plenary speeches (1)
Digital Services Act (continuation of debate)
Amendments (36)
Amendment 127 #
Proposal for a regulation
Recital 2 a (new)
(2a) States shall also commit to promoting, by means of multilateral agreements along the lines of the International Partnership for Information and Democracy signed by 21 Member States of the EU, regulation of the public information and communication space by establishing democratic guarantees for the digital space which are based on the responsibility of platforms and guarantees of the reliability of information. These multilateral commitments will provide common solutions for issues falling within the scope of this Regulation.
Amendment 199 #
Proposal for a regulation
Recital 46 a (new)
(46a) It will also be possible to better guarantee that service providers respect freedom of expression and information if the trusted flaggers can also bring or support an action against a content notice or a moderation operation, such an action being assessed as a priority. The trusted flaggers can submit their cases to platforms' internal systems for handling complaints, take action via extra-judicial dispute settlement mechanisms and ultimately go through the courts to contest a decision by the platforms.
Amendment 360 #
Proposal for a regulation
Recital 37
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. In addition, recipients of intermediary services should be able to hold the legal representative liable for non-compliance.
Amendment 404 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. The general conditions for intermediary service providers are based on the essential principles of human rights as enshrined in the Charter and in international law, in particular Article 19 of the International Covenant on Civil and Political Rights, as interpreted by general comment No 34 of the UN's Human Rights Committee.
Amendment 447 #
Proposal for a regulation
Recital 50 a (new)
(50a) After having obtained the necessary contact information of a trader, which is aimed at ensuring consumer rights, a provider of intermediary services needs to verify that these details are kept up to date and accessible for consumers. It shall therefore conduct regular and randomised checks on the information provided by the traders on its platform. To ensure a consistent display of this contact information, intermediary services should establish mandatory designs for its inclusion. A content item, good or service shall only be displayed after all necessary information has been made available by the business user.
Amendment 461 #
Proposal for a regulation
Recital 52 a (new)
(52a) The market position of very large online platforms allows them to collect and combine enormous amounts of personal data, thereby strengthening their market position vis-a-vis smaller competitors, while at the same time incentivising other online platforms to take part in comparable data collection practices and thus creating an unfavourable environment for consumers. Therefore, the collecting and further processing of personal data for the purpose of displaying tailored advertisement should be prohibited. The selection of advertisements shown to a consumer should consequently be based on contextual information, such as language settings by the device of the user or the digital location. Besides a positive effect on privacy and data protection rights of users, the ban will increase competition on the market and will facilitate market access for smaller online platforms and privacy-friendly business models.
Amendment 508 #
Proposal for a regulation
Recital 65 a (new)
(65a) Due to their market position, very large online platforms have developed an increasing influence over society’s social, economic and political interactions. Consumers face a lock-in situation, which may lead them to accept unfavourable terms and conditions in order to participate in the services provided by these very large online platforms. To restore a competitive market and to allow consumers more choice, very large online platforms should be required to set up the necessary technical access points to create interoperability for their core services, with a view to allowing competitors fairer market access and enabling more choice for consumers, while at the same time complying with privacy, security and safety standards. These access points should create interoperability for other online platform services of the same type, without the need to convert digital content or services to ensure functionality.
Amendment 536 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14, as well as actions brought by trusted flaggers or requests by a trusted flagger to support an action brought using the mechanisms referred to in Article 14, or against a decision to remove content or delete an account, are processed and decided upon with priority and without delay.
Amendment 536 #
Proposal for a regulation
Recital 73 a (new)
(73a) The designation of a Digital Services Coordinator in the Member State should be without prejudice to already existing enforcement mechanisms, such as in electronic communications or media regulation, and to the independent regulatory structures in these fields as defined by European and national law. The competences of the Digital Services Coordinator should not interfere with those of the appointed authorities. To ensure coordination and to contribute to the effective and consistent application and enforcement of this Regulation throughout the Union, the different European networks, in particular the European Regulators Group for Audiovisual Media Services (ERGA) and the Body of European Regulators for Electronic Communications (BEREC), should be responsible. For the effective implementation of this task, these networks should develop suitable procedures to be applied in cases concerning this Regulation.
Amendment 593 #
Proposal for a regulation
Article 23 a (new)
Amendment 606 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out harmonised rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter, including a high level of consumer protection, are effectively protected.
Amendment 617 #
Proposal for a regulation
Article 25 a (new)
Article 25a
Neutrality requirement
Very large online platforms shall be subject to a political, ideological and religious neutrality requirement and may not display bias in terms of opinions, ideas or political parties.
Amendment 624 #
Proposal for a regulation
Article 1 – paragraph 5 – introductory part
5. This Regulation shall and will not affect the rules laid down by the following:
Amendment 643 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
5a. This Regulation shall not affect the possibility of Member States to adopt new legislation as well as to take regulatory measures, especially with regard to intermediary service providers that serve a legitimate public interest, in particular to protect the freedom of information and media or to foster the diversity of media and opinion or of cultural and linguistic diversity.
Amendment 645 #
Proposal for a regulation
Article 1 a (new)
Article 1a
No circumvention of the rules set out in this Regulation
1. Any contractual provision between an intermediary service provider and a recipient of its service, between an intermediary service provider and a trader or between a recipient of its service and a trader, which is contrary to this Regulation, is invalid.
2. This Regulation shall apply irrespective of the law applicable to contracts.
Amendment 682 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or with a law of a Member State where it is in conformity with Union law, irrespective of the precise subject matter or nature of that law;
Amendment 713 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1a. Very large online platforms which use recommender systems shall be obliged to promote the reliability of information (due prominence obligation) by putting in place mechanisms referring to a standard of self-regulation and aimed at highlighting information sources which comply with standardised professional and ethical standards of self-regulation and giving them preferential treatment in terms of content prioritisation.
Amendment 767 #
Proposal for a regulation
Article 33 – paragraph 2 – point d a (new)
(da) an audit report on the algorithms used for referencing, personalising and content moderation.
Amendment 808 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service and trusted flaggers shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority.
Amendment 907 #
Proposal for a regulation
Article 66 a (new)
Amendment 928 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in fair, non-discriminatory and transparent contract terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be drafted in clear and unambiguous language and shall be publicly available in an easily accessible format, in a searchable archive of all the previous versions with their date of application.
Amendment 940 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to national and Union law, the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service, in particular the freedom of expression and information, as enshrined in the Charter.
Amendment 1015 #
Proposal for a regulation
Article 13 a (new)
Amendment 1019 #
Proposal for a regulation
Article 13 b (new)
Amendment 1078 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Where an online platform that allows consumers to conclude distance contracts with traders detects and identifies illegal goods or services, it shall be obliged to establish an internal database of those goods and services that have previously been taken down by the online platform because they had been found to be illegal or harmful. Drawing on the elements listed in the Rapid Exchange of Information System (RAPEX) and other relevant public databases, the online platform shall scan its database on a daily basis to detect illegal goods and services. If this process detects a good or service that has previously been found to be illegal or harmful, the online platform shall be obliged to delete the content expeditiously.
Amendment 1079 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Where the explanation of the reasons as referred to in paragraph 2(a) does not allow a diligent economic operator to identify the illegality of the content in question, where the notified content is not illegal in the country of establishment of the hosting service, or where there is genuine demonstrable doubt about the illegality of the content, the hosting service may seek assistance for further clarification from the relevant authority or the national Digital Services Coordinator.
Amendment 1132 #
Proposal for a regulation
Article 15 a (new)
Amendment 1145 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, and individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, with access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the decision taken by the provider of the online platform not to act upon the receipt of a notice or against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 1200 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall always direct recipients to an out-of-court dispute settlement body. The information about the competent out-of-court body shall be easily accessible on the online interface of the online platform in a clear and user-friendly manner.
Amendment 1208 #
Proposal for a regulation
Article 18 – paragraph 1 a (new)
1a. Online platforms shall engage, in good faith, with the independent, external certified body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 1362 #
Proposal for a regulation
Article 21 – paragraph 2 a (new)
2a. When a platform that allows consumers to conclude distance contracts with traders becomes aware that a piece of information, a product or service poses a serious risk to the life, health or safety of consumers, it shall promptly inform the competent authorities of the Member State or Member States concerned and provide all relevant information available.
Amendment 1547 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union and shall submit a report of that risk assessment to the national competent authority of the Member State in which their legal representative is established. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 1712 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available and searchable, through easy-to-access, functional and reliable tools, including through application programming interfaces, a repository containing the information referred to in paragraph 2, until five years after the advertisement was displayed for the last time on their online interfaces. They shall ensure that multi-criterion queries can be performed per advertiser and per all data points present in the advertisement, and provide aggregated data for these queries on the amount spent, the target of the advertisement, and the audience the advertiser wishes to reach. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
Amendment 1771 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this legislation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 1804 #
Proposal for a regulation
Article 33 a (new)
Article 33a
Algorithm accountability
1. When using automated decision-making, the very large online platform shall perform an assessment of the algorithms used.
2. When carrying out the assessment referred to in paragraph 1, the very large online platform shall assess the following elements:
(a) the compliance with corresponding Union requirements;
(b) how the algorithm is used and its impact on the provision of the service;
(c) the impact on fundamental rights, including on consumer rights, as well as the social effect of the algorithms; and
(d) whether the measures implemented by the very large online platform to ensure the resilience of the algorithm are appropriate with regard to the importance of the algorithm for the provision of the service and its impact on the elements referred to in point (c).
3. When performing its assessment, the very large online platform may seek advice from relevant national public authorities, researchers and non-governmental organisations.
4. Following the assessment referred to in paragraph 2, the very large online platform shall communicate its findings to the Commission. The Commission shall be entitled to request additional explanation of the conclusions of the findings or, where the additional information on the findings provided is not sufficient, any relevant information on the algorithm in question in relation to points (a), (b), (c) and (d) of paragraph 2. The very large online platform shall communicate such additional information within a period of two weeks following the request of the Commission.
5. Where the very large online platform finds that the algorithm used does not comply with point (a) or (d) of paragraph 2 of this Article, the provider of the very large online platform shall take appropriate and adequate corrective measures to ensure that the algorithm complies with the criteria set out in paragraph 2.
6. Where the Commission finds that the algorithm used by the very large online platform does not comply with point (a), (c) or (d) of paragraph 2 of this Article, on the basis of the information provided by the very large online platform, and that the very large online platform has not undertaken corrective measures as referred to in paragraph 5 of this Article, the Commission shall recommend appropriate measures laid down in this Regulation to stop the infringement.
Amendment 1919 #
Proposal for a regulation
Article 38 a (new)
Article 38a
Relation to sector-specific provisions
The application of these provisions does not affect areas that are subject to sector-specific regulation and provisions. In these areas, the responsibility for enforcing the provisions lies with the competent national authorities, which are organised in European networks. Within these networks, the competent authorities shall establish suitable procedures that allow for effective coordination and consistent application and enforcement of this Regulation.