Activities of Patrizia TOIA related to 2020/0361(COD)
Plenary speeches (1)
Digital Services Act (continuation of debate)
Amendments (57)
Amendment 98 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical, automatic and passive processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary services itself, including where the information has been developed under the editorial responsibility of that provider or where the provider of the service promotes and optimises the content.
Amendment 99 #
Proposal for a regulation
Recital 18 a (new)
(18 a) The exemptions from liability should also not be available to providers of intermediary services that do not comply with the due diligence obligations set out in this Regulation. The conditionality should further ensure that the standards to qualify for those exemptions contribute to a high level of safety and trust in the online environment in a manner that promotes a fair balance of the rights of all stakeholders.
Amendment 103 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in observance of the principle of freedom of expression. Where the illegal content can cause significant public harm, the provider should assess and, when necessary, remove or disable access to that content as expeditiously as possible, and in any case within no more than one hour. If necessary, during the period between the notice and the eventual removal or disabling, the provider should also reduce the visibility of such content online so as to minimise the risks. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and, where appropriate, act against the allegedly illegal content.
Amendment 120 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as impeding upon the ability of providers to undertake proactive measures to identify and remove illegal content and to prevent its reappearance.
Amendment 130 #
Proposal for a regulation
Recital 37
(37) Providers of intermediary services that are established in a third country and that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. Providers of intermediary services that qualify as small or micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to establish collective representation under the guidance of the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative.
Amendment 131 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easy-to-access and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider shall assess the illegality of the identified content and, based on that assessment, can decide whether or not it agrees with that notification for illegal content and wishes to remove or disable access to that content ('action'). In the event that the provider of hosting services assesses the notice of illegal content to be positive and thus decides to remove or disable access to it, it shall ensure that such content remains inaccessible after takedown. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
Amendment 137 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should prevent the reappearance of the notified illegal information. The provider should also inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. The available means of recourse to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 140 #
Proposal for a regulation
Recital 43
Amendment 144 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which must ensure human review and meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner and within a reasonable period of time. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 148 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content and are known to flag content frequently with a high rate of accuracy, that they represent collective interests and that they work in a diligent, objective and effective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry representing collective interests and of right-holders specifically created for that purpose could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions and ensure independent public interest representation. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
Amendment 168 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising that can have both an impact on the equal treatment and opportunities of citizens and on the perpetuation of harmful stereotypes and norms. Therefore, more transparency is needed in online advertising markets, and independent research needs to be carried out to assess the effectiveness of behavioural advertisements, which could pave the way for stricter measures or restrictions on behavioural advertising. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 171 #
Proposal for a regulation
Recital 52 a (new)
(52 a) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
Amendment 186 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly and separately present the main parameters for such recommender systems in a concise, accessible and easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient, and shall not make the recipients of their services subject to recommender systems based on profiling by default.
Amendment 188 #
Proposal for a regulation
Recital 63
Amendment 213 #
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
(b a) promote innovation and facilitate competition for digital services, while protecting users’ and consumers’ rights.
Amendment 214 #
Proposal for a regulation
Article 1 – paragraph 2 – point b b (new)
(b b) foster a level playing field in the online ecosystem by introducing interoperability requirements for very large platforms.
Amendment 218 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(i a) Charter of Fundamental Rights of the European Union
Amendment 221 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – introductory part
(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of the provider of information society services which has a substantial connection to the Union; such a substantial connection is deemed to exist where the provider has an establishment in the Union; in the absence of such an establishment, the assessment of a substantial connection is based on specific factual criteria, such as where the provider targets its activities towards one or more Member States.
Amendment 222 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
Amendment 223 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 2
Amendment 232 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation, and which governs itself under specific terms and conditions.
Amendment 240 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, rank and prioritise in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
Amendment 262 #
Proposal for a regulation
Article 5 – paragraph 4 a (new)
4 a. Providers of intermediary services shall be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 when they do not comply with the due diligence obligations set out in this Regulation.
Amendment 267 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
Providers of intermediary services shall ensure that voluntary investigations or activities are accompanied by appropriate safeguards, such as human oversight, to ensure that they are transparent, fair and non-discriminatory.
Amendment 268 #
Proposal for a regulation
Article 7 – title
No general monitoring or active fact-finding obligations without undermining the obligation to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk
Amendment 284 #
Proposal for a regulation
Article 11 – paragraph 5 a (new)
5 a. Providers of intermediary services that qualify as small or micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to establish collective representation under the guidance of the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative.
Amendment 317 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 328 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
Amendment 339 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. Where decisions to remove or disable access to the notified information are taken, they shall extend to preventing the reappearance thereof.
Amendment 345 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. Providers of hosting services shall ensure that content previously identified as illegal following the mechanisms in paragraphs 1 and 2 remains inaccessible after takedown.
Amendment 403 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has demonstrated particular competence, accuracy and expertise for the purposes of detecting, identifying and notifying illegal content;
Amendment 407 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests, ensures independent public interest representation and is independent from any online platform;
Amendment 409 #
Proposal for a regulation
Article 19 – paragraph 2 – point c
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner.
Amendment 410 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of trusted flagger in accordance with paragraph 2. Digital Services Coordinators shall engage in dialogue with platforms and rights holders to maintain the accuracy and efficacy of a trusted flagger system.
Amendment 418 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide illegal content. A termination of the service can be issued in case the recipients fail to comply with the applicable provisions set out in this Regulation, or in case the suspension has occurred at least three times following verification of the repeated provision of illegal content.
Amendment 487 #
Proposal for a regulation
Article 24 – paragraph 1 – point a
(a) that the information displayed, or parts thereof, is an online advertisement;
Amendment 489 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
Amendment 503 #
Proposal for a regulation
Article 24 – paragraph 1 c (new)
Online platforms shall offer the possibility to easily opt out of micro-targeted tracking.
Amendment 504 #
Proposal for a regulation
Article 24 – paragraph 1 d (new)
Online platforms shall offer the possibility to opt in to the use of behavioural data and to political advertising.
Amendment 552 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective measures to cease, prevent and mitigate the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
Amendment 554 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services and activities, or their terms and conditions;
Amendment 564 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
(b) best practices for very large online platforms to cease, prevent and mitigate the systemic risks identified.
Amendment 592 #
Proposal for a regulation
Article 29 – title
Recommender and reputation systems
Amendment 598 #
Proposal for a regulation
Article 29 – paragraph 1
1. Online platforms that use recommender systems shall set out in their terms and conditions, and separately, the information concerning the role and functioning of recommender systems in a manner that is clear for average users, concise, accessible and easily comprehensible, shall set out the main parameters used in their recommender systems, and shall offer controls with the available options for the recipients of the service, in a user-friendly manner, to modify, customise or influence those main parameters, including at least one option which is not based on profiling, within the meaning of Article 4(4) of Regulation (EU) 2016/679, but on basic natural criteria such as time, topics of interest, etc.
Amendment 601 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1 a. The parameters referred to in paragraph 1 shall include, at a minimum: (a) whether the recommender system is an automated system and, in that case, the identity of the natural or legal person responsible for the recommender system, if different from the platform provider; (b) clear information about the criteria used by recommender systems; (c) the relevance and weight of each criterion which leads to the information recommended; (d) what goals the relevant system has been optimised for; (e) if applicable, an explanation of the role that the behaviour of the recipients of the service plays in how the relevant system produces its outputs.
Amendment 609 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2 a. Online platforms that use reputation systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters through which information is collected, processed and published as reviews.
Amendment 611 #
Proposal for a regulation
Article 29 – paragraph 2 b (new)
2 b. The reputation systems shall also comply with the following criteria: a) a review must be based on a genuine experience of the users and indicate the submission date; b) a review must be published without undue delay; c) the order or relative prominence in which reviews are presented by default must not be misleading; d) in case the platform is aware of the fact that the author of a review has received any benefit for giving the review a specific positive or negative content, it needs to ensure that no such review remains published in its service; e) in case of a rejection or removal of a review, the author thereof must be informed about the reasons without undue delay; f) if the reputation system displays reviews for a fixed period of time only, the duration of this period must be indicated to platform users. This period must be reasonable, but not shorter than 12 months; g) the platform operator must provide free-of-charge mechanisms which allow platform users to submit a notification of any abuse or to submit a response, which must be published together with that review without undue delay.
Amendment 613 #
Proposal for a regulation
Article 29 – paragraph 2 c (new)
Amendment 619 #
Proposal for a regulation
Article 30 – paragraph 2 – point d a (new)
(d a) whether one or more particular groups of recipients of the service have been explicitly excluded from the advertisement target group;
Amendment 627 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide information and access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of facilitating and conducting public interest research that contributes to the identification and understanding of systemic risks as set out in Article 26(1) and to enable verification of the effectiveness and proportionality of the mitigation measures as set out in Article 27(1).
Amendment 631 #
Proposal for a regulation
Article 31 – paragraph 3 a (new)
3 a. Very large online platforms shall provide effective portability of data generated through the activity of a business user or end user and shall, in particular, provide tools for end users to facilitate the exercise of data portability, in line with Regulation (EU) 2016/679, including by the provision of continuous and real-time access.
Amendment 633 #
Proposal for a regulation
Article 31 – paragraph 3 c (new)
3 c. The data provided to vetted researchers shall be as disaggregated as possible, unless the researcher requests otherwise.
Amendment 681 #
Proposal for a regulation
Article 42 – paragraph 3
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual income or global turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply to or rectify incorrect, incomplete or misleading information, and failure to submit to an on-site inspection shall not exceed 1 % of the annual income or global turnover of the provider concerned.
Amendment 684 #
Proposal for a regulation
Article 42 – paragraph 4
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily global turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned.
Amendment 704 #
Proposal for a regulation
Article 59 – paragraph 1 – introductory part
1. In the decision pursuant to Article 58, the Commission may impose on the very large online platform concerned fines not exceeding 6% of its total global turnover in the preceding financial year where it finds that the platform, intentionally or negligently:
Amendment 705 #
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
2. The Commission may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total global turnover in the preceding financial year, where they intentionally or negligently:
Amendment 706 #
Proposal for a regulation
Article 60 – paragraph 1 – introductory part
1. The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily global turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to: