Activities of Daniel DALTON related to 2018/0331(COD)
Reports (1)
REPORT on the proposal for a regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online PDF (577 KB) DOC (263 KB)
Shadow opinions (1)
OPINION on the proposal for a regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online
Amendments (42)
Amendment 104 #
Proposal for a regulation
Recital 3
(3) The presence of terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help tackle terrorist content disseminated through their services, whilst taking into account the fundamental importance of the freedom of expression and information in an open and democratic society.
Amendment 117 #
Proposal for a regulation
Recital 7
(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures should avoid interference in the freedom of expression and information and in so far as possible should serve to prevent the dissemination of terrorist content through a strictly targeted approach, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.
Amendment 121 #
Proposal for a regulation
Recital 8
(8) The right to an effective remedy is enshrined in Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union. Each natural or legal person has the right to an effective judicial remedy before the competent national court against any of the measures taken pursuant to this Regulation, which can adversely affect the rights of that person. The right includes, in the context of this Regulation, the possibility for users to contest the removal of content resulting from measures taken by the hosting service provider as foreseen in this Regulation and to be informed of effective means of remedies. It also includes the right for hosting service providers and content providers to effectively contest the removal orders, imposed proactive measures or penalties, before the court of the Member State whose authorities issued the removal order, imposed proactive measures or penalties, or the court where the hosting service provider is established or represented.
Amendment 127 #
Proposal for a regulation
Recital 9
(9) In order to provide clarity about the actions that both hosting service providers and the competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9. Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content. The right to such expression can be invoked before the court of the Member State where the hosting service provider has its main establishment or where the legal representative designated by the hosting service provider pursuant to this Regulation resides or is established.
_________________ 9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).
Amendment 134 #
Proposal for a regulation
Recital 10
(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored directly available to the public. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services, excluding cloud IT infrastructure service providers, to the extent they make the information directly available to the public. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.
Amendment 143 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should freely designate a single competent authority with that task, unless their constitutional arrangements prevent a single authority from being responsible. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.
Amendment 146 #
Proposal for a regulation
Recital 14
(14) The competent authority should transmit the removal order directly to the addressee and point of contact by any electronic means capable of producing a written record under conditions that allow the service provider to establish the authenticity of the order, without unreasonable financial or other burden on the service provider, including the accuracy of the date and the time of sending and receipt of the order, such as by secured email and platforms or other secured channels, including those made available by the service provider, in line with the rules protecting personal data. This requirement may notably be met by the use of qualified electronic registered delivery services as provided for by Regulation (EU) 910/2014 of the European Parliament and of the Council12.
_________________ 12 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257, 28.8.2014, p. 73).
Amendment 157 #
Proposal for a regulation
Recital 18
(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content which was removed or to which access was disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are necessary, effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness, necessity and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, its economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union), as well as the safeguards put in place to protect freedom of expression and information and the number of incidents of restrictions on legal content.
Amendment 158 #
Proposal for a regulation
Recital 19
(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. The competent authority should only impose proactive measures that the hosting service provider can reasonably be expected to implement, taking into account, among other factors, the hosting service provider's financial and other resources. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could, in exceptional circumstances, derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.
Amendment 162 #
Proposal for a regulation
Recital 24
(24) Transparency of hosting service providers' policies in relation to terrorist content is essential to enhance their accountability towards their users and to reinforce trust of citizens in the Digital Single Market. Hosting service providers should publish annual transparency reports containing meaningful information about action taken in relation to the detection, identification and removal of terrorist content, as well as the number of restrictions on legal content. Competent authorities should also publish annual transparency reports containing meaningful information on the number of legal orders issued, the number of removals, the number of items of terrorist content identified, detected and removed, and the number of restrictions on legal content.
Amendment 166 #
Proposal for a regulation
Recital 25
(25) Complaint procedures constitute a necessary safeguard against erroneous removal of content protected under the freedom of expression and information. Hosting service providers should therefore establish user-friendly complaint mechanisms and ensure that complaints are dealt with promptly and in full transparency towards the content provider, and this should include information on all effective remedy options, including judicial redress routes. The requirement for the hosting service provider to reinstate the content where it has been removed in error, does not affect the possibility of hosting service providers to enforce their own terms and conditions on other grounds.
Amendment 168 #
Proposal for a regulation
Recital 26
(26) Effective legal protection according to Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union requires that persons are able to ascertain the reasons upon which the content uploaded by them has been removed or access to it disabled. For that purpose, the hosting service provider should make available to the content provider meaningful information enabling the content provider to contest the decision. However, this does not necessarily require a notification to the content provider. Depending on the circumstances, hosting service providers may replace content which is considered terrorist content, with a message that it has been removed or disabled in accordance with this Regulation. Further information about the reasons as well as possibilities for the content provider to contest the decision should be given upon request. Where competent authorities decide that for reasons of public security including in the context of an investigation, it is considered inappropriate or counter-productive to directly notify the content provider of the removal or disabling of content, they should inform the hosting service provider.
Amendment 181 #
Proposal for a regulation
Recital 39
(39) The use of standardised templates facilitates cooperation and the exchange of information between competent authorities and service providers, allowing them to communicate more quickly and effectively. It is particularly important to ensure swift action following the receipt of a removal order, depending on the size and means of the hosting service provider. Templates reduce translation costs and contribute to a high quality standard. Response forms similarly should allow for a standardised exchange of information, and this will be particularly important where service providers are unable to comply. Authenticated submission channels can guarantee the authenticity of the removal order, including the accuracy of the date and the time of sending and receipt of the order.
Amendment 185 #
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
1. This Regulation lays down uniform rules to prevent and address the misuse of hosting services for the dissemination of terrorist content online. It lays down in particular:
Amendment 188 #
Proposal for a regulation
Article 1 – paragraph 1 – point b
(b) a set of measures to be put in place by Member States to identify terrorist content, to enable its swift removal by hosting service providers in accordance with Union law providing suitable safeguards for freedom of expression and information and to facilitate cooperation with the competent authorities in other Member States, hosting service providers and where appropriate relevant Union bodies.
Amendment 197 #
Proposal for a regulation
Article 1 – paragraph 2 a (new)
2 a. This Regulation shall not have the effect of undermining the obligation to respect fundamental rights and fundamental legal principles as enshrined in Article 6 of the Treaty on European Union.
Amendment 204 #
Proposal for a regulation
Article 2 – paragraph 1 – point 1
(1) 'hosting service provider' means a provider of information society services consisting in the storage of information provided by and at the request of the content provider and in making the information stored available to the public;
Amendment 237 #
Proposal for a regulation
Article 2 – paragraph 1 – point 6
(6) ‘dissemination of terrorist content’ means making terrorist content available to the public on the hosting service providers’ services;
Amendment 242 #
Proposal for a regulation
Article 3 – paragraph 1
1. Hosting service providers shall take appropriate, reasonable and proportionate actions in accordance with this Regulation, against the dissemination of terrorist content and to protect users from terrorist content. In doing so, they shall act in a diligent, proportionate and non-discriminatory manner, and with due regard in all circumstances to the fundamental rights of the users and take into account the fundamental importance of the freedom of expression and information in an open and democratic society. In particular those actions shall not amount to general monitoring.
Amendment 263 #
Proposal for a regulation
Article 1 – paragraph 1 – point a
(a) rules on duties of care to be applied by hosting service providers in order to prevent the public dissemination of terrorist content through their services and ensure, where necessary, its swift removal;
Amendment 267 #
Proposal for a regulation
Article 4 – paragraph 3 – point g
(g) where necessary and appropriate, the decision not to disclose information about the removal of terrorist content or the disabling of access to it referred to in Article 11.
Amendment 268 #
Proposal for a regulation
Article 4 – paragraph 3 – point g a (new)
(g a) deadlines for appeal for the hosting service provider and for the content provider.
Amendment 290 #
Proposal for a regulation
Article 6 – paragraph 1
1. Hosting service providers may, where appropriate, in particular where there is a non-incidental level of exposure to terrorist content and receipt of removal orders, take proactive measures to protect their services against the dissemination of terrorist content. The measures shall be effective, targeted and proportionate to the risk and level of exposure to terrorist content, paying particular regard to the fundamental rights of the users, and the fundamental importance of the freedom of expression and information in an open and democratic society.
Amendment 306 #
Proposal for a regulation
Article 6 – paragraph 3
3. Where the competent authority referred to in Article 17(1)(c) considers that the proactive measures taken and reported under paragraph 2 do not respect the principles of necessity and proportionality or are insufficient in mitigating and managing the risk and level of exposure, it may request the hosting service provider to re-evaluate the measures needed or to take specific additional proactive measures. For that purpose, the hosting service provider shall cooperate with the competent authority referred to in Article 17(1)(c) with a view to identifying the changes or specific measures that the hosting service provider shall put in place, establishing key objectives and benchmarks as well as timelines for their implementation.
Amendment 310 #
Proposal for a regulation
Article 6 – paragraph 4
4. Where no agreement can be reached within the three months from the request pursuant to paragraph 3, the competent authority referred to in Article 17(1)(c) may issue a decision imposing specific additional necessary and proportionate proactive measures. The competent authority shall not impose a general monitoring obligation. The decision shall take into account, in particular, the economic capacity of the hosting service provider and the effect of such measures on the fundamental rights of the users and the fundamental importance of the freedom of expression and information. Such a decision shall be sent to the main establishment of the hosting service provider or to the legal representative designated by the service provider. The hosting service provider shall regularly report on the implementation of such measures as specified by the competent authority referred to in Article 17(1)(c).
Amendment 324 #
Proposal for a regulation
Article 8 – paragraph 1
1. Hosting service providers shall set out in their terms and conditions their policy to prevent the dissemination of terrorist content, including, where appropriate, a meaningful explanation of the functioning of proactive measures, including in particular on the use of automated tools.
Amendment 327 #
Proposal for a regulation
Article 8 – paragraph 2
2. Hosting service providers and the authorities competent to issue removal orders shall publish annual transparency reports on action taken against the dissemination of terrorist content.
Amendment 333 #
Proposal for a regulation
Article 8 – paragraph 3 – point d
(d) overview and outcome of complaint procedures, including the number of cases in which it was established that content was wrongly identified as terrorist content.
Amendment 340 #
Proposal for a regulation
Article 9 – paragraph 2
2. Safeguards shall consist, in particular, of human oversight and verification of the appropriateness of the decision to remove or deny access to content, in particular with regard to the right to freedom of expression and information. Human oversight shall be required where a detailed assessment of the relevant context is required in order to determine whether or not the content is to be considered terrorist content.
Amendment 344 #
Proposal for a regulation
Article 10 – paragraph 2
2. Hosting service providers shall promptly examine every complaint that they receive and reinstate the content without undue delay where the removal or disabling of access was unjustified. They shall inform the complainant about the outcome of the examination within two weeks from the receipt of the complaint.
Amendment 346 #
Proposal for a regulation
Article 10 – paragraph 2 a (new)
2 a. Notwithstanding the provisions of paragraphs 1 and 2, the complaint mechanism of hosting service providers shall be complementary to the applicable laws and procedures of the Member States.
Amendment 348 #
Proposal for a regulation
Article 11 – paragraph 2
2. Upon request of the content provider, the hosting service provider shall inform the content provider about the reasons for the removal or disabling of access, including the legal basis for this action, and possibilities to contest the decision.
Amendment 349 #
Proposal for a regulation
Article 11 – paragraph 3 a (new)
3 a. An appeal as referred to in Article 4(9) shall be lodged with the court of the Member State where the hosting service provider has its main establishment or where the legal representative designated by the hosting service provider pursuant to Article 16 resides or is established.
Amendment 366 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Each Member State shall designate the authority or authorities competent to
Amendment 380 #
Proposal for a regulation
Article 4 – paragraph 1
1. Except in cases where a hosting service provider has already been in receipt of a prior removal order, the competent authority must communicate with the hosting service provider before issuing a removal order requiring the hosting service provider to remove terrorist content or disable access to it.
Amendment 391 #
Proposal for a regulation
Article 18 – paragraph 4 a (new)
4 a. Competent authorities shall consider unintentional delays, in particular by small and medium-sized businesses and start-ups, to be mitigating factors when determining the types and level of penalties.
Amendment 394 #
Proposal for a regulation
Article 23 – paragraph 1
No sooner than [three years from the date of application of this Regulation], the Commission shall carry out an evaluation of this Regulation and submit a report to the European Parliament and to the Council on the application of this Regulation, including the effectiveness of the safeguard mechanisms. The report shall also cover the impact of this Regulation on freedom of expression and information. Where appropriate, the report shall be accompanied by legislative proposals. Member States shall provide the Commission with the information necessary for the preparation of the report.
Amendment 419 #
Proposal for a regulation
Article 4 – paragraph 4
4. Upon request by the hosting service provider or by the content provider, the competent authority shall provide a detailed statement of reasons, including the reasons why the content must be removed within the deadline set out in paragraph 2. This shall be without prejudice to the obligation of the hosting service provider to comply with the removal order within that deadline.
Amendment 442 #
Proposal for a regulation
Article 4 a (new)
Article 4 a
Consultation procedure for removal orders
1. The issuing authority shall submit a copy of the removal order to the competent authority referred to in Article 17(1)(a) of the Member State in which the main establishment of the hosting service provider is located at the same time it is transmitted to the hosting service provider in accordance with Article 4(5).
2. In cases where the competent authority of the Member State in which the main establishment of the hosting service provider is located has reasonable grounds to believe that the removal order may impact fundamental interests of that Member State, it shall inform the issuing competent authority.
3. The issuing authority shall take these circumstances into account and shall, where necessary, withdraw or adapt the removal order.
Amendment 473 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 1 – introductory part
Where it has been informed according to Article 4(9) and after establishing that a hosting service provider has received a non-incidental number of final removal orders, the competent authority referred to in Article 17(1)(c) shall request the hosting service provider to submit a report, within three months after receipt of the request and thereafter at least on an annual basis, on the specific proactive measures it has taken, including by using automated tools, with a view to:
Amendment 518 #
Proposal for a regulation
Article 6 – paragraph 5
5. A hosting service provider may, at any time, request the competent authority referred to in Article 17(1)(c) to review and, where appropriate, revoke a request or decision pursuant to paragraphs 2, 3 and 4 respectively. The competent authority shall provide a reasoned decision within a reasonable period of time after receiving the request by the hosting service provider. Decisions taken pursuant to Article 6(4) shall, at the request of the hosting service provider, be subject to review by a competent national court.
Amendment 681 #
Proposal for a regulation
Article 18 – paragraph 1 – introductory part
1. Member States shall lay down the rules on penalties applicable to systematic and ongoing breaches of the obligations by hosting service providers under this Regulation and shall take all necessary measures to ensure that they are implemented. Such penalties shall be limited to infringement of the obligations pursuant to: