10 Amendments of Ibán GARCÍA DEL BLANCO related to 2020/2018(INL)
Amendment 3 #
Draft opinion
Recital A a (new)
Aa. Whereas the aim of a review of the DSA is to update the civil and commercial law rules governing the responsibility of online platforms and hosting service providers, in order to provide certainty and safety for companies, users and society as a whole.
Amendment 11 #
Draft opinion
Paragraph 1
1. Stresses that wherever it is technically and legally possible and reasonable, intermediaries should be required to enable the anonymous use of their services and payment for them, since anonymity effectively prevents unauthorised data disclosure and identity theft; notes that where Union law requires commercial traders to communicate their identity, providers of online market places could be obliged to verify the identity of the traders acting in a professional or non-professional capacity, while in other cases the right to use digital services anonymously could be upheld;
Amendment 19 #
Draft opinion
Paragraph 2
2. Calls on platform operators not only to immediately delete illegal content after positive identification, but also to continuously transmit it to the competent authorities for the purpose of further prosecution, including the metadata necessary for this purpose;
Amendment 26 #
Draft opinion
Paragraph 2
2. Notes that, since the online activities of individuals allow for deep insights into their personality and make it possible to manipulate them, the collection and use of personal data concerning the use of digital services, notably by systemic platforms and social networks, should be subjected to a specific privacy framework and limited to the extent necessary to provide and bill for the use of the service;
Amendment 29 #
Draft opinion
Paragraph 3
3. Insists that the protection and promotion of freedom and diversity of opinion, information, the press and cultural forms of expression, as well as the protection of the privacy of communication between individuals, form the basis of liberal democracy and that this applies online without restriction; is also aware of the possible conflict between these freedoms and other fundamental rights, such as security and property - including intellectual property - and of the possibly irreparable consequences of a failure to ensure prior control of certain content; demands therefore that the use of all technologically feasible means of combating harmful or illegal content on the internet in this context be subjected to rigorous standards that are known to users in advance; stresses the need to notify users in real time of the activation of such controls and of the existence of an effective and adaptable appeals procedure, firstly with the private supplier and then with an administrative authority;
Amendment 31 #
Draft opinion
Paragraph 3
3. Notes that automated tools could improve the functioning of content recognition technologies to differentiate illegal content from content that is legal where it is legally provided, and could enhance existing practices if used in combination with human review and increased transparency on their functioning; underlines that, to ensure the well-being and independence of the personnel, staff should receive proper training and adequate psychological support to develop skills that help them and to discern the need to collaborate with public authorities where needed; stresses that the Digital Services Act should explicitly prohibit any obligation on hosting service providers or other technical intermediaries to use automated tools for content moderation, and refrain from imposing notice-and-stay-down mechanisms, unless otherwise specified in existing legal texts; insists that content moderation procedures used by providers should not lead to any ex-ante control measures based on automated tools, upload-filtering or general monitoring of uploaded content;
Amendment 41 #
Draft opinion
Paragraph 4
4. Stresses that the final responsibility for enforcing the law, deciding on the legality of online activities and ordering hosting service providers should always rest with independent judicial and other competent public authorities; hosting service providers have a duty of care to ensure a safe online environment for users, and therefore their responsibility to remove or disable access to illegal content as soon as possible should be based on existing legislation and jurisprudence; considers that only a hosting service provider that has actual knowledge of illegal content and its illegal nature, through a valid notice, should be subject to content removal obligations and be required to ensure that the same illegal content cannot be re-uploaded;
Amendment 54 #
Draft opinion
Paragraph 4 a (new)
4a. Stresses that the Digital Services Act should be applied without prejudice to the provisions of the Directive on copyright and related rights in the Digital Single Market (Copyright Directive); notes that an 'online content-sharing service provider' as defined in Article 2(6) of the Copyright Directive has responsibilities regarding copyright-protected material and that the safe harbour provisions of the E-Commerce Directive, or any similar provisions of the Digital Services Act possibly applicable to these platforms, do not apply in such cases;
Amendment 60 #
Draft opinion
Paragraph 5
5. Emphasises that the spread of disinformation such as false and racist information on social media should be contained, also by giving users control over content proposed to them; stresses that curating content on the basis of tracking user actions should require the user’s consent; proposes that users of social networks should have a right to see their timeline in chronological order; suggests that dominant platforms should provide users with an API to have content curated by software or services of their choice;
Amendment 70 #
Draft opinion
Paragraph 6 a (new)
6a. Stresses that online marketplaces should be forbidden from imposing a locked ecosystem for the use of digital products sold on their services; a high level of interoperability should be ensured by providing these products in a format that does not limit their use exclusively to a certain proprietary digital ecosystem.