Activities of Patrick BREYER related to 2020/2018(INL)
Opinions (1)
OPINION with recommendations to the Commission on Digital Services Act: Improving the functioning of the Single Market
Shadow opinions (1)
OPINION with recommendations to the Commission on Digital Services Act: Improving the functioning of the Single Market
Amendments (24)
Amendment 6 #
Draft opinion
Paragraph 1
1. Underlines that digital services and their underlying algorithms need to fully respect fundamental rights, especially privacy, the protection of privacy and personal data, non-discrimination and the freedom of expression and information, as enshrined in the Treaties and the Charter of Fundamental Rights of the European Union; calls therefore on the Commission to implement an obligation of transparency and explainability of algorithms, penalties to enforce such obligations, and the possibility of human intervention, as well as other measures, such as independent audits and specific stress tests to assist and enforce compliance;
Amendment 8 #
Draft opinion
Paragraph 1
1. Underlines that digital services and their underlying algorithms need to fully respect fundamental rights, especially privacy and the protection of privacy and personal data, non-discrimination and the freedom of expression and information, as enshrined in the Treaties and the Charter of Fundamental Rights of the European Union; calls therefore on the Commission to implement an obligation of transparency and explainability of algorithms, and the possibility of human intervention, as well as other measures, such as independent audits and specific stress tests to assist and enforce compliance;
Amendment 9 #
Draft opinion
Paragraph 1 – indent 1 (new)
- such independent audits should be conducted annually, in analogy with the financial sector, to examine whether the data policy, algorithms and checks and balances used are in accordance with specified criteria and are supervised by a sufficiently independent oversight authority;
Amendment 10 #
Draft opinion
Paragraph 1 – subparagraph 1 (new)
Notes that platforms use automated decision-making algorithms to disseminate and order the content shown to users, including to organise their personal feed; stresses that these algorithms, how they work and how they order the material shown, are a black box to users, which takes away choice and control from the user, enables the creation of echo chambers and leads to distrust in digital services; calls on the Commission to compel digital services, in its DSA proposal, to offer content by default in chronological order, and to increase users' control over the content they see by default;
Amendment 22 #
Draft opinion
Paragraph 2
2. Emphasises that the rapid development of digital services requires a futureproof legislative framework to protect personal data and privacy; stresses therefore in this regard that all digital services need to fully respect Union data protection law, namely Regulation (EU) 2016/679 of the European Parliament and of the Council (GDPR)1 and Directive 2002/58/EC of the European Parliament and of the Council (ePrivacy)2, currently under revision, and the freedom of expression and non-discrimination;
_________________
1 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
2 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) (OJ L 201, 31.7.2002, p. 37).
Amendment 23 #
Draft opinion
Paragraph 2
2. Emphasises that the rapid development of digital services requires strong legislation to protect personal data to ensure digital dignity; stresses therefore in this regard that all digital services need to fully respect Union data protection law, namely Regulation (EU) 2016/679 of the European Parliament and of the Council (GDPR)1 and Directive 2002/58/EC of the European Parliament and of the Council (ePrivacy)2, currently under revision, and the freedom of expression;
_________________
1 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
2 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) (OJ L 201, 31.7.2002, p. 37).
Amendment 26 #
Draft opinion
Paragraph 2 a (new)
2 a. Notes that since the online activities of an individual allow for deep insights into their personality and make it possible to manipulate them, the general and indiscriminate collection of personal data concerning every use of a digital service interferes disproportionately with the right to privacy; confirms that users have a right not to be subject to pervasive tracking when using digital services; stresses that, in the spirit of the jurisprudence on communications metadata, public authorities shall be given access to a user's subscriber data and metadata only to investigate suspects of serious crime, with prior judicial authorisation;
Amendment 28 #
Draft opinion
Paragraph 2 a (new)
2 a. Stresses the importance of clear obligations with regard to content moderation; underlines for this reason the principle that what is illegal offline is also illegal online, as well as the prohibition of general monitoring; asks the Commission to let these principles prevail in the DSA proposal;
Amendment 32 #
Draft opinion
Paragraph 2 b (new)
2 b. Stresses that in order to overcome the lock-in effect of centralised networks and to ensure competition and consumer choice, users of dominant social media services and messaging services shall be given a right to cross-platform interaction via open interfaces (interconnectivity); highlights that these users shall be able to interact with users of alternative services, and that the users of alternative services shall be allowed to interact with them;
Amendment 37 #
Draft opinion
Paragraph 2 c (new)
2 c. Stresses that, in order to protect the freedom of expression and information, it is crucial to maintain the limited liability regime and the prohibition of general monitoring obligations for intermediaries; notes that automated tools are unable to differentiate illegal content from content that is legal in a given context; highlights that a review of automated reports by service providers, their staff or their contractors does not solve this problem, as private staff lack the independence, qualification and accountability of public authorities; therefore stresses that the Digital Services Act shall explicitly prohibit any obligation on hosting service providers or other technical intermediaries to use automated tools for content moderation; content moderation procedures used by providers shall not lead to any ex-ante control measures based on automated tools or upload-filtering of content; urges the adoption of rules on transparent notice-and-action mechanisms that provide for adequate safeguards and possibilities to seek effective remedies against decisions removing legal online content; stresses that independent public authorities should be ultimately responsible for determining whether online content is legal or not;
Amendment 41 #
Draft opinion
Paragraph 2 d (new)
2 d. Emphasizes that no notice-and-stay-down mechanisms should be imposed as they rely on algorithms that cannot assess the legality of content;
Amendment 42 #
Draft opinion
Paragraph 2 e (new)
2 e. Highlights that, in order to constructively build upon the rules of the e-Commerce Directive and to ensure legal certainty, applicable legislation shall exhaustively and explicitly spell out the duties of digital service providers rather than imposing a general duty of care; highlights that the legal regime for digital providers' liability should not depend on uncertain notions such as the ‘active’ or ‘passive’ role of providers;
Amendment 44 #
Draft opinion
Paragraph 2 f (new)
2 f. Stresses that the responsibility for enforcing the law, deciding on the legality of online activities and ordering hosting service providers to remove or disable access to content as soon as possible shall rest with independent judicial authorities; only a hosting service provider that has actual knowledge of illegal content and its illegal nature shall be subject to content removal obligations;
Amendment 48 #
Draft opinion
Paragraph 3
3. Calls on the Commission to reflect on the need and the technical possibilities for age verification systems; stresses that such systems must be transparent, secure, and anonymised in order to prevent data misuse and user tracking;
Amendment 50 #
Draft opinion
Paragraph 3 a (new)
3 a. Stresses that, in line with the principle of data minimisation established by the General Data Protection Regulation, the Digital Services Act shall require intermediaries to enable the anonymous use of their services and payment for them wherever it is technically possible, as anonymity effectively prevents unauthorized disclosure, identity theft and other forms of abuse of personal data collected online; only where existing legislation requires businesses to communicate their identity could providers of major marketplaces be obliged to verify their identity, while in other cases the right to use digital services anonymously shall be upheld;
Amendment 54 #
Draft opinion
Paragraph 4
4. Points out that biometric data is considered to be a special category of personal data with specific rules for processing; notes that biometrics can be and are used for the identification and authentication of individuals, which entails significant risks to and interferences with the right to privacy and data protection, as well as enabling identity fraud; calls on the Commission to incorporate in its Digital Services Act an obligation to always give users of digital services an alternative to using biometric data set by default for the functioning of a service, and an obligation to clearly inform customers of the risks of using biometric data; stresses that a digital service may not be refused where the individual refuses to use biometric data;
Amendment 62 #
Draft opinion
Paragraph 5
5. Notes the potential negative impact of personalised advertising, in particular micro-targeted and behavioural advertisements, and of assessments of individuals, especially on minors and other vulnerable groups, by interfering in the private life of individuals, posing questions as to the collection and use of the data used to personalise advertising, offering products or services or setting prices; calls therefore on the Commission to introduce a phase-out prohibition on personalised advertisements, starting with minors, and a prohibition on the use of discriminatory practices for the provision of services or products;
Amendment 63 #
Draft opinion
Paragraph 5
5. Notes the potential negative impact of micro-targeted advertising and of assessment of individuals, especially on minors and other vulnerable groups, by interfering in the private life of individuals, posing questions as to the collection and use of the data used to target said advertising, offering products or services or setting prices; reconfirms that the ePrivacy Directive makes targeted advertising subject to an opt-in decision and that it is otherwise prohibited; calls on the Commission to prohibit the use of discriminatory practices for the provision of services or products.
Amendment 72 #
Draft opinion
Paragraph 5 a (new)
5 a. Calls on the Commission to consider obliging major hosting service providers to report serious crime to the competent law enforcement authority, upon obtaining actual knowledge of such a crime;
Amendment 77 #
Draft opinion
Paragraph 5 b (new)
Amendment 78 #
Draft opinion
Paragraph 5 c (new)
5 c. Highlights that, in order to protect freedom of speech standards, to avoid conflicts of laws, to avert unjustified and ineffective geo-blocking and to aim for a harmonised digital single market, hosting service providers shall not be required to remove or disable access to information that is legal in their country of origin;
Amendment 79 #
Draft opinion
Paragraph 5 c (new)
5 c. Notes that the e-Commerce Directive dates back to 2000; observes that the data protection regime has been significantly updated since then; recalls therefore that any future provision of the e-Commerce Directive should fully respect the European data protection regime;
Amendment 82 #
Draft opinion
Paragraph 5 d (new)
5 d. Points out that the Digital Services Act shall not use the legally undefined concept of “harmful content”, but shall address the publication of content that is unlawful; emphasizes that the spreading of false and racist information on social media should be contained by giving users control over content proposed to them; stresses that curating content on the basis of tracking user actions shall require the user’s consent; proposes that users of social networks should have a right to see their timeline in chronological order; suggests that dominant platforms shall provide users with an interface to have content curated by software or services of their choice;
Amendment 83 #
Draft opinion
Paragraph 5 e (new)
5 e. Stresses the need to apply effective end-to-end encryption to data, because it is essential for trust in and security on the Internet; realises that effectively preventing unauthorized third party access necessarily prevents access by providers and authorities as well; takes the view that as digital services permeate every aspect of our lives, society, and critical infrastructure, securing this infrastructure and safeguarding fundamental rights by applying effective encryption takes priority;