Activities of Patrick BREYER related to 2020/2019(INL)
Shadow reports (1)
REPORT with recommendations to the Commission on a Digital Services Act: adapting commercial and civil law rules for commercial entities operating online
Amendments (51)
Amendment 16 #
Motion for a resolution
Recital A
A. whereas digital services, being a cornerstone of the Union’s economy and the livelihood of a large number of its citizens, need to be regulated in a way that respects fundamental rights and other rights of citizens, while supporting economic progress and the digital environment;
Amendment 75 #
Motion for a resolution
Recital P
P. whereas access to non-personal data is an important factor in the growth of the digital economy; whereas the interoperability of non-personal data can, by removing lock-in effects, play an important part in ensuring that fair market conditions exist;
Amendment 83 #
Motion for a resolution
Paragraph 1
1. Requests that the Commission submit without undue delay a set of legislative proposals comprising a Digital Services Act with an adequate material, personal and territorial scope, including the recommendations as set out in the Annex to this resolution; considers that, without prejudice to detailed aspects of the future legislative proposals, Article 114 of the Treaty on the Functioning of the European Union should be chosen as the legal basis;
Amendment 100 #
Motion for a resolution
Paragraph 3
3. Stresses that the responsibility for enforcing the law shall rest with public authorities; considers that any final decision on the legality of user-generated content must be made by an independent judiciary and not a private commercial entity;
Amendment 136 #
Motion for a resolution
Paragraph 7
7. Recommends the establishment of independent dispute settlement bodies tasked with settling disputes regarding content moderation; takes the view that to protect anonymous publications and the general interest, not only the user who generated moderated content but also an independent Ombudsperson and organisations safeguarding freedom of expression should be able to challenge content moderation decisions;
Amendment 145 #
Motion for a resolution
Paragraph 8
8. Takes the firm position that the Digital Services Act must not contain provisions forcing content hosting platforms to employ any form of fully automated ex-ante controls of content, and shall refrain from imposing notice-and-stay-down mechanisms, and considers that any such mechanisms voluntarily employed by platforms shall not lead to any ex-ante control measures based on automated tools or upload-filtering of content;
Amendment 152 #
Motion for a resolution
Paragraph 9
9. Considers that the user-targeted amplification of content based on personal information is one of the most detrimental practices in the digital society, especially in cases where the visibility of such content is increased on the basis of previous user interaction with other amplified content and with the purpose of optimising user profiles for targeted advertisements; is concerned that these practices fuel so-called surveillance capitalism and rely on pervasive tracking and data-mining;
Amendment 159 #
Motion for a resolution
Paragraph 10
10. Is of the view that the use of targeted advertising must be regulated more strictly in favour of less intrusive forms of advertising that do not require identifiable tracking of user interaction with content;
Amendment 165 #
Motion for a resolution
Paragraph 11
11. Recommends, therefore, that the Digital Services Act makes consent a precondition for accumulation of data for the purpose of targeted advertising, especially when data are tracked on third party websites;
Amendment 166 #
Motion for a resolution
Paragraph 11 a (new)
11a. Stresses that, in line with the principle of data minimisation established by the General Data Protection Regulation, the Digital Services Act shall require intermediaries to enable the anonymous use of, and payment for, their services wherever technically possible, as anonymity effectively prevents unauthorised disclosure, identity theft and other forms of abuse of personal data collected online; only where existing legislation requires businesses to communicate their identity could providers of major marketplaces be obliged to verify those businesses’ identity, while in other cases the right to use digital services anonymously shall be upheld;
Amendment 168 #
Motion for a resolution
Paragraph 11 b (new)
11b. Notes that since the online activities of an individual allow for deep insights into their personality and make it possible to manipulate them, the general and indiscriminate collection of personal data concerning every use of a digital service interferes disproportionately with the right to privacy; confirms that users have a right not to be subject to pervasive tracking when using digital services; stresses that, in the spirit of the jurisprudence on communications metadata, public authorities shall be given access to a user’s subscriber data and metadata only to investigate suspects of serious crime with prior judicial authorisation;
Amendment 169 #
Motion for a resolution
Paragraph 11 c (new)
11c. Recommends that providers which support a single sign-on service with a dominant market share should be required to also support at least one open and federated identity system based on a non-proprietary framework;
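The “open and federated identity system based on a non-proprietary framework” recommended here is not named in the text; OpenID Connect is one plausible example of such a standard. The sketch below is illustrative only: it uses a hypothetical discovery document to check that a provider publishes the metadata fields that OpenID Connect discovery requires.

```python
# Illustrative sketch only: the amendment names no specific framework.
# OpenID Connect (an open, non-proprietary federation standard) is used
# here as one example a provider could support alongside its own
# proprietary single sign-on.

REQUIRED_METADATA = {
    "issuer",
    "authorization_endpoint",
    "token_endpoint",
    "jwks_uri",
    "response_types_supported",
}

def validate_discovery_document(doc: dict) -> list[str]:
    """Return the required OIDC discovery fields missing from `doc`."""
    return sorted(REQUIRED_METADATA - doc.keys())

# A minimal, hypothetical discovery document such as a platform could
# publish at https://example.org/.well-known/openid-configuration
sample = {
    "issuer": "https://example.org",
    "authorization_endpoint": "https://example.org/authorize",
    "token_endpoint": "https://example.org/token",
    "jwks_uri": "https://example.org/jwks.json",
    "response_types_supported": ["code"],
}

print(validate_discovery_document(sample))
```

Because the metadata document is self-describing, any relying party can federate with such a provider without a proprietary SDK.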
Amendment 173 #
Motion for a resolution
Paragraph 12
12. Calls on the Commission to assess the possibility of defining fair contractual conditions to facilitate the sharing of non-personal data with the aim of addressing imbalances in market power; suggests that users of dominant social media services and messaging services be given a right to cross-platform interaction via open interfaces (interconnectivity); highlights that these users shall be able to interact with users of alternative services, and that the users of alternative services shall be allowed to interact with them;
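The resolution prescribes no protocol for the open interfaces it calls for. As a toy sketch only (all names hypothetical), interconnectivity can be pictured as two services implementing one shared delivery interface and thereby exchanging messages in both directions:

```python
# Hypothetical model of "interconnectivity": any service exposing the
# same open delivery interface can exchange messages with any other,
# regardless of which platform is dominant.

from dataclasses import dataclass, field

@dataclass
class Service:
    name: str
    inbox: list = field(default_factory=list)

    def deliver(self, sender: str, text: str) -> None:
        # The shared open interface: any conforming service may call it.
        self.inbox.append((sender, text))

def send(frm: Service, to: Service, text: str) -> None:
    """Cross-platform send between any two conforming services."""
    to.deliver(frm.name, text)

dominant = Service("BigChat")       # illustrative dominant service
alternative = Service("SmallChat")  # illustrative alternative service
send(dominant, alternative, "hello")    # dominant -> alternative
send(alternative, dominant, "hi back")  # alternative -> dominant
```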
Amendment 180 #
Motion for a resolution
Paragraph 13
13. Calls for content hosting platforms to give users the free choice of whether or not to consent to the use of targeted advertising based on the user’s prior interaction with content on the same content hosting platform or on third party websites; reconfirms that the ePrivacy Directive makes targeted advertising subject to an opt-in decision and that it is otherwise prohibited;
Amendment 183 #
Motion for a resolution
Paragraph 14
14. Further calls for users to be guaranteed an appropriate degree of influence over the criteria according to which content is curated and made visible for them; affirms that this should also include the option to opt out from any content curation; suggests that dominant platforms shall provide users with an interface to have content curated by software or services of their choice;
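Paragraph 14 suggests that dominant platforms expose an interface through which software of the user’s choice curates their feed. As a hedged illustration only (the text prescribes no API), curation can be modelled as a user-selected function over the raw timeline, with plain chronological order as the opt-out baseline:

```python
# Hypothetical sketch of a pluggable content-curation interface; all
# names are illustrative, not part of the resolution's text.

from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Post:
    author: str
    text: str
    timestamp: int

# A third-party curator is just a function from the raw timeline to an
# ordered selection of posts.
Curator = Callable[[Iterable[Post]], list[Post]]

def chronological(posts: Iterable[Post]) -> list[Post]:
    """The opt-out baseline: no ranking, newest first."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def render_timeline(posts: Iterable[Post],
                    curator: Curator = chronological) -> list[Post]:
    """The platform applies whichever curator the user selected."""
    return curator(posts)

posts = [Post("a", "first", 1), Post("b", "second", 2)]
print([p.text for p in render_timeline(posts)])
```

Any third-party ranking tool that accepts the timeline and returns an ordering could be plugged in as `curator` without the platform controlling what is amplified.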
Amendment 203 #
Motion for a resolution
Subheading 3
Provisions regarding terms and conditions, smart contracts and blockchains
Amendment 204 #
Motion for a resolution
Paragraph 17 a (new)
17a. Underlines that the fairness of, and compliance with fundamental rights standards by, terms and conditions imposed by intermediaries on the users of their services shall be subject to judicial review; terms and conditions unduly restricting users’ fundamental rights, such as the right to privacy and to freedom of expression, shall not be binding;
Amendment 215 #
Motion for a resolution
Paragraph 19
19. Considers that standard contractual terms and conditions should neither prevent effective access to justice in Union courts nor disenfranchise Union citizens or businesses, and that the status of access rights to data under private international law is uncertain and leads to disadvantages for Union citizens and businesses;
Amendment 220 #
Motion for a resolution
Paragraph 21 a (new)
21a. Stresses that service providers shall not be required to remove or disable access to information that is legal in their country of origin;
Amendment 221 #
Motion for a resolution
Subheading 4 a (new)
Addressing illegal activities
Amendment 222 #
Motion for a resolution
Paragraph 21 b (new)
21b. Highlights that, in order to constructively supplement the rules of the e-Commerce Directive and to ensure legal certainty, applicable legislation shall exhaustively and explicitly spell out the duties of digital service providers rather than imposing a general duty of care; highlights that the legal regime for digital providers’ liability should not depend on uncertain notions such as the ‘active’ or ‘passive’ role of providers;
Amendment 223 #
Motion for a resolution
Paragraph 21 d (new)
21d. Stresses that the responsibility for enforcing the law, deciding on the legality of online activities and ordering hosting service providers to remove or disable access to content as soon as possible shall rest with independent judicial authorities; only a hosting service provider that has actual knowledge of illegal content and is aware beyond doubt of its illegal nature shall be subject to content removal obligations;
Amendment 224 #
Motion for a resolution
Paragraph 21 e (new)
21e. Underlines that illegal content should be removed where it is hosted, and that access providers shall not be required to block access to content;
Amendment 225 #
Motion for a resolution
Paragraph 21 f (new)
21f. Stresses that proportionate sanctions should be applied to violations of the law, which shall not encompass excluding individuals from digital services;
Amendment 268 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 1
- regular auditing of the algorithms employed by content hosting platforms for the purpose of content moderation as well as curation;
Amendment 285 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 1
- failure to implement the notice-and- action system as provided for in the Regulation;
Amendment 291 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 2
- failure to provide fair, transparent, accessible and non-discriminatory terms and conditions;
Amendment 298 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 3
- failure to provide access for the European Agency to content moderation and curation algorithms for review;
Amendment 323 #
Motion for a resolution
Annex I – part A – part II – section 1 – indent 1
- Measures to limit the data collected by content hosting platforms, based on interactions of users with content hosted on content hosting platforms, for the purpose of completing targeted advertising profiles, in particular by imposing strict conditions for the use of targeted personal advertisements and making the collection of personal data subject to user consent.
Amendment 336 #
Motion for a resolution
Annex I – part A – part II – section 2 – introductory part
The path to fair implementation of the rights of users as regards interconnectivity and portability should include:
Amendment 338 #
Motion for a resolution
Annex I – part A – part II – section 2 – indent 1
- an assessment of the possibility of defining fair contractual conditions to facilitate non-personal data sharing with the aim of addressing imbalances in market power, in particular through the interoperability and portability of data.
Amendment 342 #
Motion for a resolution
Annex I – part A – part II – section 2 – indent 1 a (new)
- users of dominant social media services and messaging services shall be given a right to cross-platform interaction via open interfaces (interconnectivity); these users shall be able to interact with users of alternative services, and users of alternative services shall be allowed to interact with them.
Amendment 355 #
Motion for a resolution
Annex I – part A – part II – section 4 – indent 1
- include measures ensuring that standard contractual terms and conditions do not include provisions regulating private international law matters to the detriment of access to justice,
Amendment 366 #
Motion for a resolution
Annex I – part B – recital 5
(5) Concerning relations with users, this Regulation should lay down minimum standards for the fairness, transparency and accountability of the terms and conditions of content hosting platforms. Terms and conditions should include fair, transparent, binding and uniform standards and procedures for content moderation, which should guarantee accessible and independent recourse to judicial redress and comply with human rights standards.
Amendment 370 #
Motion for a resolution
Annex I – part B – recital 6
(6) User-targeted amplification of content based on personal user information is one of the most detrimental practices in the digital society, especially when such content is amplified on the basis of previous user interaction with other amplified content and with the purpose of optimising user profiles for targeted advertisements.
Amendment 373 #
Motion for a resolution
Annex I – part B – recital 7
(7) In order to ensure, inter alia, that users can assert their rights they should be given an appropriate degree of influence over the curation of content made visible to them, including the possibility to opt out of any content curation altogether. In particular, users should not be subject to curation without specific consent. Dominant platforms should provide users with an interface to have content curated by software or services of their choice.
Amendment 378 #
Motion for a resolution
Annex I – part B – recital 9
(9) This Regulation should not contain provisions forcing content hosting platforms to employ any form of fully automated ex-ante control of content, should refrain from imposing notice-and-stay-down mechanisms, and should provide that content moderation procedures used voluntarily by providers shall not lead to any ex-ante control measures based on automated tools or upload-filtering of content.
Amendment 387 #
Motion for a resolution
Annex I – part B – recital 11
(11) The right to issue a notice pursuant to this Regulation should remain with any natural or legal person, including public bodies, to which content is provided through a website or application. A content hosting platform should, however, be able to block a user who repeatedly issues false notices from issuing further notices.
Amendment 390 #
Motion for a resolution
Annex I – part B – recital 13
(13) All concerned parties should be informed about a decision as regards a notice. The information provided to concerned parties should also include, apart from the outcome of the decision, at least the reason for the decision and whether the decision was taken by a human, as well as relevant information regarding review or redress.
Amendment 410 #
Motion for a resolution
Annex I – part B – Article 2 – paragraph 1
This Regulation applies to the management by content hosting platforms of content that is accessible to the public on websites or through smartphone applications in the Union, irrespective of the place of establishment or registration, or the principal place of business, of the content hosting platform. It shall not apply to non-commercial content hosting platforms or to platforms with fewer than 100,000 users.
Amendment 414 #
Motion for a resolution
Annex I – part B – Article 3 – point 1
(1) ‘content hosting platform’ means an information society service within the meaning of point (b) of Article 1(1) of Directive (EU) 2015/1535 of the European Parliament and of the Council1 of which the main or one of the main purposes is to allow signed-up or non-signed-up users to upload content for display on a publicly accessible website or application;
__________________
1 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1).
Amendment 423 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 1
1. Content management shall be conducted in a fair, lawful and transparent manner. Content management practices shall comply with relevant principles of human rights law and be appropriate, relevant and limited to what is necessary in relation to the purposes for which the content is managed. Content hosting platforms with 100,000 users or more shall conduct assessments of the direct and indirect human rights impact of their current and future content management practices on users and affected parties, and ensure appropriate follow-up to these assessments, including monitoring and evaluating the effectiveness of identified responses.
Amendment 438 #
Motion for a resolution
Annex I – part B – Article 5 – subparagraph 1
Any natural or legal person or public body to which content is provided through a website or application shall have the right to issue a notice pursuant to this Regulation with or without providing personal data.
Amendment 440 #
Motion for a resolution
Annex I – part B – Article 5 – subparagraph 2
Amendment 448 #
Motion for a resolution
Annex I – part B – Article 8 – introductory part
Upon a notice being issued, and before any decision on the content has been made, the uploader of the content in question shall receive the following information and be heard, where the uploader has chosen to identify themselves and to provide contact details:
Amendment 453 #
Motion for a resolution
Annex I – part B – Article 9 – point 1
1. Content hosting platforms shall ensure that decisions on notifications are taken by qualified staff without undue delay following the necessary investigations.
Amendment 458 #
Motion for a resolution
Annex I – part B – Article 9 – point 2
2. Following a notice, content hosting platforms shall decide to remove, take down or disable access to content that was the subject of a notice, if such content without any doubt does not comply with legal requirements.
Amendment 462 #
Motion for a resolution
Annex I – part B – Article 10 – point b
Amendment 466 #
Motion for a resolution
Annex I – part B – Article 11
Amendment 481 #
Motion for a resolution
Annex I – part B – Article 14 – paragraph 1
1. The uploader as well as non-profit entities with a legitimate interest in defending freedom of expression and information shall have the right to refer a case of content moderation to the competent independent dispute settlement body where the content hosting platform has decided to remove or take down content or otherwise to act in a manner that is contrary to the action preferred by the uploader as expressed by the uploader or interferes with freedom of expression and information.
Amendment 487 #
Motion for a resolution
Annex I – part B – Article 16 a (new)
Article 16a
Sanctions
Member States shall provide for penalties where a person acting for purposes relating to their trade, business, craft or profession systematically and repeatedly submits wrongful notices. Such penalties shall be effective, proportionate and dissuasive.