European Union
International Regulations
European Union General Data Protection Regulation (EU GDPR)
Regulation (EU) 2016/679, of the European Parliament and the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation), 2016 O.J. (L 119) 1.
Highlights
Territorial Scope:
The EU GDPR applies to the processing of Personal Data in the context of the activities of an establishment of a Controller or a Processor in the EU, regardless of whether the processing takes place in the EU or not.
The EU GDPR applies to the processing of Personal Data of Data Subjects who are in the EU by a Controller or Processor not established in the EU, where the processing activities are related to:
- the offering of goods or services, irrespective of whether a payment of the Data Subject is required, to such Data Subjects in the EU; or
- the monitoring of their behavior as far as their behavior takes place within the EU.
Principles:
- Lawfulness, fairness, and transparency
- Purpose limitation
- Data minimization
- Accuracy
- Storage limitation
- Integrity and confidentiality
- Accountability
Lawfulness of Processing:
Processing is lawful only if and to the extent that at least one of the following applies:
- the Data Subject has given consent to the processing of his or her Personal Data for one or more specific purposes;
- processing is necessary for the performance of a contract;
- processing is necessary for compliance with a legal obligation;
- processing is necessary in order to protect the vital interests of the Data Subject or of another natural person;
- processing is necessary for the performance of a task carried out in the public interest; or
- processing is necessary for the purposes of the legitimate interests pursued by the Controller or by a third party.
Controller and Processor Obligations:
- Implement appropriate technical and organizational measures to ensure and to demonstrate that processing is performed in accordance with the EU GDPR.
- Implement appropriate technical and organizational measures, such as pseudonymization, which are designed to implement data protection principles by design and by default.
- For Controllers or Processors not established in the EU, designate in writing an EU representative.
- Ensure that processing by a Processor is governed by a contract or other legal act under EU or Member State law that is binding on the Processor with regard to the Controller and that sets out the required provisions.
- For Processors, not process Personal Data except on instructions from the Controller, unless required to do so by EU or Member State law.
- Maintain a record of processing activities.
- Cooperate on request with the Supervisory Authority.
- Implement appropriate technical and organizational measures to ensure a level of security appropriate to the risk.
- For Controllers, in the case of a Personal Data Breach, notify the competent Supervisory Authority without undue delay and, where feasible, not later than 72 hours after having become aware of it.
- For Processors, notify the Controller without undue delay after becoming aware of a Personal Data Breach.
- For Controllers, communicate the Personal Data Breach to the Data Subject without undue delay when the Personal Data Breach is likely to result in a high risk to the rights and freedoms of natural persons.
- Carry out a data protection impact assessment where a type of processing is likely to result in a high risk to the rights and freedoms of natural persons.
- Designate a data protection officer where:
- the processing is carried out by a public authority or body, except for courts acting in their judicial capacity;
- the core activities of the Controller or the Processor consist of processing operations which, by virtue of their nature, their scope and/or their purposes, require regular and systematic monitoring of Data Subjects on a large scale; or
- the core activities of the Controller or the Processor consist of processing on a large scale of Special Categories of Personal Data pursuant to Article 9 or Personal Data relating to criminal convictions and offenses referred to in Article 10.
Data Subject Rights:
The Controller shall respond to the Data Subject without undue delay and in any event within 1 month of receipt of the request. That period may be extended by 2 further months where necessary, taking into account the complexity and number of the requests.
- Right of access
- Right to rectification
- Right to erasure (“right to be forgotten”)
- Right to restriction of processing
- Right to data portability
- Right to object to processing of Personal Data
- Right not to be subject to a decision based solely on automated processing, including Profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.
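To illustrate the one-month response window above (extendable by two further months), the following sketch computes both deadlines from the date a request is received. This is an illustrative helper, not legal advice; the function names are hypothetical, and calendar-month arithmetic is simplified by clamping to the last day of the target month.

```python
from datetime import date
import calendar

def add_months(d: date, months: int) -> date:
    # Move forward by whole calendar months, clamping to the last
    # day of the target month (e.g. Jan 31 + 1 month -> Feb 28/29)
    total = d.month - 1 + months
    year, month = d.year + total // 12, total % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def dsar_deadlines(received: date) -> dict:
    # Art. 12(3): respond within 1 month of receipt; the period may be
    # extended by 2 further months for complex or numerous requests
    return {
        "standard": add_months(received, 1),
        "extended": add_months(received, 3),  # 1 month + 2 further months
    }
```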
Cross-border Data Transfers to Third Countries or International Organizations:
- Transfers on the Basis of an Adequacy Decision: A transfer of Personal Data to a third country or an international organization may take place where the European Commission has decided that the third country, a territory or one or more specified sectors within that third country, or the international organization in question ensures an adequate level of protection.
- Transfers subject to Appropriate Safeguards: In the absence of an adequacy decision, a Controller or Processor may transfer Personal Data to a third country or an international organization only if the Controller or Processor has provided appropriate safeguards, and on condition that enforceable Data Subject rights and effective legal remedies for Data Subjects are available. [Example: the EU Standard Contractual Clauses.]
- Binding Corporate Rules: Binding corporate rules are data protection policies adhered to by companies established in the EU for transfers of Personal Data outside the EU within a group of undertakings or enterprises. Companies must submit binding corporate rules for approval to the competent Supervisory Authority in the EU.
- Derogations: In the absence of an adequacy decision and appropriate safeguards, a transfer to a third country or international organization is still possible if it meets the requirements set out in Article 49 of the GDPR, i.e., mainly where the transfer is necessary for specified purposes, occasional, and not repetitive. In that case, the Data exporter transferring Personal Data to third countries or international organizations must also meet the conditions of other GDPR provisions, namely Articles 5 and 6.
Personal Data Breach Notification:
- Timeline for Notification to Supervisory Authority: Without undue delay and, where feasible, not later than 72 hours after becoming aware of a Personal Data Breach (unless the Personal Data Breach is unlikely to result in a risk to the rights and freedoms of natural persons). The Processor shall notify the Controller without undue delay after becoming aware of a Personal Data Breach.
- Requirements for Notification to Supervisory Authority: The notification shall at least:
- Describe the nature of the Personal Data Breach including where possible, the categories and approximate number of Data Subjects concerned and the categories and approximate number of Personal Data records concerned;
- Communicate the name and contact details of the data protection officer or other contact point where more information can be obtained;
- Describe the likely consequences of the Personal Data Breach; and
- Describe the measures taken or proposed to be taken by the Controller to address the Personal Data Breach, including, where appropriate, measures to mitigate its possible adverse effects.
- Requirements for Notification to Affected Data Subjects: When the Personal Data Breach is likely to result in a high risk to the rights and freedoms of natural persons, the Controller shall communicate the Personal Data Breach to the Data Subject without undue delay. Such communication shall describe the nature of the Personal Data Breach and at least contain the name and contact details of the data protection officer or other contact point, the likely consequences of the breach, and the measures taken or proposed to address it.
- When is Notification to Affected Data Subjects not required?
- If the Controller has implemented appropriate technical and organizational protection measures, and those measures were applied to the Personal Data affected by the Personal Data Breach, in particular those that render the Personal Data unintelligible to any person who is not authorized to access it, such as encryption.
- If the Controller has taken subsequent measures which ensure that the high risk to the rights and freedoms of Data Subjects is no longer likely to materialize.
- If it would involve disproportionate effort. In such a case, there shall instead be a public communication or similar measure whereby the Data Subjects are informed in an equally effective manner.
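As a simple illustration of the 72-hour rule above, the check below compares the time of notification against the moment the Controller became aware of the breach. This is a sketch with illustrative names; it does not model the "where feasible" qualifier or the duty to give reasons for a delayed notification.

```python
from datetime import datetime, timedelta

SA_WINDOW = timedelta(hours=72)  # Art. 33(1) notification window

def sa_notification_on_time(aware_at: datetime, notified_at: datetime) -> bool:
    # True if the Supervisory Authority was notified within 72 hours of
    # the Controller becoming aware; a later notification must be
    # accompanied by reasons for the delay
    return notified_at - aware_at <= SA_WINDOW
```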
More Details
Definitions:
- Controller: The natural or legal person, public authority, agency, or other body which, alone or jointly with others, determines the purposes and means of the processing of Personal Data; where the purposes and means of such processing are determined by EU or Member State law, the controller or the specific criteria for its nomination may be provided for by EU or Member State law.
- Data Subject: An identified or identifiable natural person. An identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.
- Personal Data: Any information relating to an identified or identifiable natural person.
- Personal Data Breach: A breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to, Personal Data transmitted, stored, or otherwise processed.
- Processor: A natural or legal person, public authority, agency, or other body which processes Personal Data on behalf of the Controller.
- Profiling: Any form of automated processing of Personal Data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.
- Pseudonymization: The processing of Personal Data in such a manner that the Personal Data can no longer be attributed to a specific Data Subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organizational measures to ensure that the Personal Data are not attributed to an identified or identifiable natural person.
- Special Categories of Personal Data: Personal Data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation. The EU GDPR also has restrictions on the processing of Personal Data relating to criminal convictions and offenses.
- Supervisory Authority: An independent public authority which is established by a Member State pursuant to Article 51.
Penalties:
Infringements of certain provisions can be subject to administrative fines up to €10,000,000, or in the case of an undertaking, up to 2% of the total worldwide annual turnover of the preceding financial year, whichever is higher.
Infringements of certain other provisions can be subject to administrative fines up to €20,000,000, or in the case of an undertaking, up to 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher.
Non-compliance with an order by a Supervisory Authority can be subject to administrative fines up to €20,000,000, or in the case of an undertaking, up to 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher.
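The "whichever is higher" rule above can be sketched as follows (illustrative names; the tiers correspond to the €10,000,000/2% and €20,000,000/4% ceilings):

```python
def gdpr_fine_ceiling(annual_turnover_eur: float, tier: str) -> float:
    # Maximum administrative fine: the higher of a fixed cap or a share
    # of total worldwide annual turnover of the preceding financial year
    caps = {
        "lower": (10_000_000, 0.02),  # e.g. controller/processor obligations
        "upper": (20_000_000, 0.04),  # e.g. basic principles, Data Subject rights
    }
    fixed_cap, turnover_share = caps[tier]
    return max(fixed_cap, turnover_share * annual_turnover_eur)
```

For an undertaking with EUR 2 billion turnover, the upper-tier ceiling is the turnover-based amount (EUR 80 million); for a small undertaking, the fixed cap dominates.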
Remedies, Liability, and Complaints:
- Right to Lodge a Complaint with a Supervisory Authority: Every Data Subject has the right to lodge a complaint with a Supervisory Authority, in particular in the Member State of his or her habitual residence, place of work or place of the alleged infringement if the Data Subject considers that the processing of Personal Data relating to him or her infringes the EU GDPR.
- Right to an Effective Judicial Remedy against a Supervisory Authority: Each natural or legal person has the right to an effective judicial remedy against a legally binding decision of a Supervisory Authority concerning them.
- Right to an Effective Judicial Remedy against a Controller or Processor: Each Data Subject has the right to an effective judicial remedy where he or she considers that his or her rights under the EU GDPR have been infringed as a result of the processing of his or her Personal Data in non-compliance with the EU GDPR.
Effective Date:
May 25, 2018
NIS 2 Directive
Directive (EU) 2022/2555 of the European Parliament and of the Council of 14 December 2022 on measures for a high common level of cybersecurity across the Union, amending Regulation (EU) No 910/2014 and Directive (EU) 2018/1972, and repealing Directive (EU) 2016/1148 (NIS 2 Directive).
Highlights
Subject Matter:
- Obligations that require Member States to adopt national cybersecurity strategies and to designate or establish competent authorities, cyber crisis management authorities, single points of contact on cybersecurity (single points of contact) and computer security incident response teams (CSIRTs);
- Cybersecurity risk-management measures and reporting obligations for essential and important entities in sectors of high criticality as well as for entities identified as critical entities under Directive (EU) 2022/2557 (Critical Entities Resilience Directive);
- Rules and obligations on cybersecurity information sharing;
- Supervisory and enforcement obligations on Member States.
Scope:
The NIS 2 Directive applies to public or private entities of a highly critical or critical sector which qualify as medium-sized enterprises (i.e., enterprises which employ fewer than 250 persons and which have an annual turnover not exceeding EUR 50 million and/or an annual balance sheet total not exceeding EUR 43 million), or which exceed the ceilings for medium-sized enterprises, and which provide their services or carry out their activities within the Union.
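As a rough sketch of the size cap described above, the check below treats an entity as in scope when it is at least medium-sized under Commission Recommendation 2003/361. This is a simplification: the small-enterprise ceiling there is fewer than 50 staff and EUR 10 million turnover and/or balance sheet, and headcount and partner-enterprise rules are ignored; the function name is illustrative.

```python
def at_least_medium_sized(staff: int, turnover_m_eur: float,
                          balance_m_eur: float) -> bool:
    # Recommendation 2003/361: a small (or micro) enterprise employs fewer
    # than 50 persons and has turnover and/or balance sheet total not
    # exceeding EUR 10 million. NIS 2's general size cap keeps entities
    # that are medium-sized or larger (sector carve-ins apply regardless
    # of size, per the list that follows).
    qualifies_as_small = staff < 50 and (turnover_m_eur <= 10 or balance_m_eur <= 10)
    return not qualifies_as_small
```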
Regardless of their size, the NIS 2 also applies to:
- Providers of public electronic communications networks or of publicly available electronic communications services, trust service providers, and top-level domain name registries and DNS service providers.
- Entities identified as critical entities under Directive (EU) 2022/2557.
- Entities that are the sole provider in a Member State of a service which is essential for the maintenance of critical societal or economic activities.
- Entities whose service disruption could have a significant impact on public safety, public security or public health.
- Entities whose service disruption could induce a significant systemic risk, in particular for sectors where such disruption could have a cross-border impact.
- Entities that are critical because of their specific importance at national or regional level for the particular sector or type of service, or for other interdependent sectors in the Member State.
- Public administration entities of central government or at regional level.
- Entities providing domain name registration services.
- Potentially, public administration entities at local level and education institutions, especially when carrying out critical research activities.
Sectors Involved:
Highly Critical Sectors
- Energy
- Transport
- Banking
- Financial market infrastructures
- Health
- Drinking water
- Waste water
- Digital infrastructure
- ICT service management (business-to-business)
- Public administration
- Space
Other Critical Sectors
- Postal and courier services
- Waste management
- Manufacture, production and distribution of chemicals
- Production, processing and distribution of food
- Manufacturing
- Digital providers
- Research
Essential and Important Entities:
Essential Entities
- Entities of highly critical sectors exceeding medium-sized enterprise thresholds (per EU recommendation thresholds).
- Trust service providers, top-level domain registries, and DNS service providers (regardless of size).
- Public communications network/service providers that qualify as medium-sized enterprises.
- Public administration entities of central government as defined by a Member State.
- Entities designated as essential by Member States (of highly critical or critical sectors) and set by a regularly updated list.
- Entities classified as critical under Directive (EU) 2022/2557.
- Entities recognized by Member States prior to January 16, 2023 as operators of essential services.
Important Entities:
- Entities of highly critical or critical sectors not qualifying as essential under the above definition.
- Includes entities named by Member States as important.
Coordinated Cybersecurity Framework:
- Each Member State shall adopt a national cybersecurity strategy that provides for the strategic objectives, the resources required to achieve those objectives, and appropriate policy and regulatory measures, with a view to achieving and maintaining a high level of cybersecurity.
- Each Member State shall designate or establish one or more competent authorities responsible for cybersecurity and for the supervisory task.
- Each Member State shall designate or establish one or more competent authorities responsible for the management of large-scale cybersecurity incidents and crises (cyber crisis management authorities).
- Each Member State shall designate or establish one or more Computer security incident response teams (CSIRTs) responsible for incident handling in accordance with a well-defined process.
- Each Member State shall designate one of its CSIRTs as a coordinator for the purposes of coordinated vulnerability disclosure.
- ENISA (the EU Agency for Cybersecurity) must develop and maintain, after consulting the Cooperation Group, a European vulnerability database.
- In order to support and facilitate strategic cooperation and the exchange of information among Member States, as well as to strengthen trust and confidence, a Cooperation Group is established.
- A network of national CSIRTs is established.
- EU-CyCLONe is established to support the coordinated management of large-scale cybersecurity incidents and crises at operational level and to ensure the regular exchange of relevant information among Member States and Union institutions, bodies, offices and agencies.
- ENISA shall adopt, in cooperation with the Commission and the Cooperation Group, a biennial report on the state of cybersecurity in the Union, including particular policy recommendations.
- Peer reviews are established, with a methodology to be set by the Cooperation Group with the assistance of the Commission and ENISA.
Cybersecurity Risk-Management Measures and Reporting Obligations:
Governance:
- Member States must ensure that entity management bodies approve and oversee cybersecurity measures, are held accountable for violations, and undergo regular training—alongside staff—to strengthen risk awareness and cybersecurity proficiency.
Cybersecurity Risk-Management Measures:
- Essential and important entities must implement proportionate cybersecurity measures—aligned with international standards—to safeguard their systems and services, covering everything from risk analysis and incident response to supply chain security, encryption, and authentication.
- Member States are responsible for ensuring enforcement, supporting corrective actions, considering supplier vulnerabilities, and adopting Commission-led technical standards tailored to key digital service sectors by October 2024.
Union Level Coordinated Security Risk Assessments of Critical Supply Chains:
- The Cooperation Group, together with the Commission and ENISA, may conduct joint risk assessments of critical ICT service and product supply chains, considering both technical and non-technical risks. The Commission is responsible for identifying which ICT services, systems, or products require such coordinated evaluations, in consultation with relevant bodies and stakeholders.
Reporting Obligations for Essential and Important Entities:
- Essential and important entities must, without undue delay, notify national CSIRTs or competent authorities of any significant incidents or threats, especially those with cross-border effects. They will not face increased liability for doing so.
- Significant Incident: An incident is significant if:
- it has caused or is capable of causing severe operational disruption of the services or financial loss for the entity concerned; or
- it has affected or is capable of affecting other natural or legal persons by causing considerable material or non-material damage.
- In any event, entities must submit:
- an early warning within 24 hours of becoming aware of the significant incident;
- a detailed incident report within 72 hours; and
- final reporting (or progress updates for ongoing incidents) within one month of discovery, including severity, impact, and threat cause.
- CSIRTs and authorities should provide guidance within 24 hours of early warnings, and share incident data—while respecting confidentiality—with other affected Member States, ENISA, and the public when necessary for safety.
- Single points of contact must forward incident data between Member States, submit quarterly anonymized summaries to ENISA, and ensure entities identified as critical under Directive 2022/2557 report relevant threats.
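The reporting cascade above can be sketched as a small schedule calculator (illustrative names; "one month" is approximated as 30 days):

```python
from datetime import datetime, timedelta

def nis2_reporting_schedule(aware_at: datetime) -> dict:
    # Article 23 cascade for a significant incident, keyed off the
    # moment the entity becomes aware of it
    return {
        "early_warning": aware_at + timedelta(hours=24),
        "incident_notification": aware_at + timedelta(hours=72),
        "final_report": aware_at + timedelta(days=30),  # 'one month', simplified
    }
```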
Use of European Cybersecurity Certification Schemes:
- Member States may require essential and important entities to use particular ICT products, ICT services and ICT processes, developed by the essential or important entity or procured from third parties, that are certified under European cybersecurity certification schemes adopted pursuant to Article 49 of Regulation (EU) 2019/881. Furthermore, Member States shall encourage essential and important entities to use qualified trust services.
Information Sharing:
Member States must enable both covered and non-covered entities to voluntarily share cybersecurity-related information—such as threats, vulnerabilities, and countermeasures—within secure arrangements that respect the sensitive nature of the data exchanged. Entities participating in these arrangements must notify authorities of their involvement, while ENISA supports their development and Member States process voluntary incident notifications without imposing extra obligations.
Supervision and Enforcement:
- Member States must ensure that their competent authorities effectively supervise and take the measures necessary to ensure compliance with the NIS 2 Directive.
- Member States must ensure that the supervisory or enforcement measures imposed on essential and important entities are effective, proportionate and dissuasive, taking into account the circumstances of each individual case. They have the power to subject these entities at least to:
- on-site inspections and off-site supervision;
- regular and targeted security audits;
- ad hoc audits (only for essential entities);
- security scans;
- requests for information necessary to assess the cybersecurity risk-management measures adopted by the entity concerned;
- requests to access data, documents and information necessary to carry out their supervisory tasks;
- requests for evidence of implementation of cybersecurity policies, such as the results of security audits carried out by a qualified auditor and the respective underlying evidence.
- Member States shall ensure that their competent authorities, when exercising their enforcement powers in relation to essential and important entities have the power at least to:
- issue warnings about infringements by the entities concerned;
- adopt binding instructions and time-limits for the implementation of such measures and for reporting on their implementation, or an order requiring the entities concerned to remedy the deficiencies identified or the infringements of the NIS 2 Directive;
- order the entities concerned to cease conduct that infringes the NIS 2 Directive and desist from repeating that conduct; order the entities concerned to ensure that their cybersecurity risk-management measures comply with Article 21 or to fulfil the reporting obligations laid down in Article 23, in a specified manner and within a specified period;
- order the entities concerned to inform the natural or legal persons with regard to which they provide services or carry out activities which are potentially affected by a significant cyber threat of the nature of the threat, as well as of any possible protective or remedial measures which can be taken by those natural or legal persons in response to that threat;
- order the entities concerned to implement the recommendations provided as a result of a security audit within a reasonable deadline;
- designate a monitoring officer with well-defined tasks for a determined period of time to oversee the compliance of the entities concerned with Articles 21 and 23 (only for essential entities);
- order the entities concerned to make public aspects of infringements of this Directive in a specified manner;
- impose, or request the imposition by the relevant bodies, courts or tribunals, in accordance with national law, of an administrative fine pursuant to Article 34 in addition to any of the measures referred to above.
Penalties:
Member States shall lay down rules on penalties applicable to infringements of national measures adopted pursuant to this Directive and shall take all measures necessary to ensure that they are implemented. The penalties provided for shall be effective, proportionate and dissuasive.
Administrative fines on essential and important entities for breaching cybersecurity risk-management measures or reporting obligations:
- for essential entities: up to EUR 10,000,000 or 2% of the total worldwide annual turnover in the preceding financial year of the undertaking to which the essential entity belongs, whichever is higher;
- for important entities: up to EUR 7,000,000 or 1.4% of the total worldwide annual turnover in the preceding financial year of the undertaking to which the important entity belongs, whichever is higher.
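The same "whichever is higher" mechanism can be sketched per entity class (illustrative names; amounts taken from the ceilings above):

```python
def nis2_fine_ceiling(annual_turnover_eur: float, entity_type: str) -> float:
    # Fine ceiling: the higher of a fixed amount or a share of the
    # undertaking's total worldwide annual turnover, by entity class
    caps = {
        "essential": (10_000_000, 0.02),
        "important": (7_000_000, 0.014),
    }
    fixed_cap, turnover_share = caps[entity_type]
    return max(fixed_cap, turnover_share * annual_turnover_eur)
```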
Effective Date:
Upon transposition by Member States, which should be completed by October 17, 2024
EU DORA Regulation
Regulation (EU) 2022/2554 of the European Parliament and of the Council of 14 December 2022 on digital operational resilience for the financial sector and amending Regulations (EC) No 1060/2009, (EU) No 648/2012, (EU) No 600/2014, (EU) No 909/2014 and (EU) 2016/1011.
Highlights
Subject Matter:
The EU DORA Regulation lays down uniform requirements concerning the security of network and information systems supporting the business processes of financial entities as follows:
- Requirements applicable to financial entities in relation to:
- (i) information and communication technology (ICT) risk management;
- (ii) reporting of major ICT-related incidents and notifying, on a voluntary basis, significant cyber threats to the competent authorities;
- (iii) reporting of major operational or security payment-related incidents to the competent authorities by financial entities referred to in Article 2(1), points (a) to (d);
- (iv) digital operational resilience testing;
- (v) information and intelligence sharing in relation to cyber threats and vulnerabilities;
- (vi) measures for the sound management of ICT third-party risk;
- requirements in relation to the contractual arrangements concluded between ICT third-party service providers and financial entities;
- rules for the establishment and conduct of the Oversight Framework for critical ICT third-party service providers when providing services to financial entities;
- rules on cooperation among competent authorities, and rules on supervision and enforcement by competent authorities in relation to all matters covered by this Regulation
Scope:
The EU DORA Regulation applies to financial entities, i.e.:
- credit institutions; payment institutions; account information service providers; electronic money institutions; investment firms; crypto-asset service providers authorised under the EU Regulation on markets in crypto-assets and issuers of asset-referenced tokens; central securities depositories; central counterparties; trading venues; trade repositories; managers of alternative investment funds; management companies; data reporting service providers; insurance and reinsurance undertakings; insurance intermediaries, reinsurance intermediaries and ancillary insurance intermediaries; institutions for occupational retirement provision; credit rating agencies; administrators of critical benchmarks; crowdfunding service providers; securitisation repositories; and ICT third-party service providers.
The EU DORA does not apply to:
- managers of alternative investment funds as referred to in Article 3(2) of Directive 2011/61/EU; insurance and reinsurance undertakings as referred to in Article 4 of Directive 2009/138/EC; institutions for occupational retirement provision which operate pension schemes which together do not have more than 15 members in total; natural or legal persons exempted pursuant to Articles 2 and 3 of Directive 2014/65/EU; insurance intermediaries, reinsurance intermediaries and ancillary insurance intermediaries which are microenterprises or small or medium-sized enterprises; and post office giro institutions as referred to in Article 2(5), point (3), of Directive 2013/36/EU.
Key Definitions:
- Digital operational resilience: the ability of a financial entity to build, assure and review its operational integrity and reliability by ensuring, either directly or indirectly through the use of services provided by ICT third-party service providers, the full range of ICT-related capabilities needed to address the security of the network and information systems which a financial entity uses, and which support the continued provision of financial services and their quality, including throughout disruptions.
- ICT: information and communication technology.
- ICT risk: any reasonably identifiable circumstance in relation to the use of network and information systems which, if materialised, may compromise the security of the network and information systems, of any technology dependent tool or process, of operations and processes, or of the provision of services by producing adverse effects in the digital or physical environment.
- ICT-related incident: a single event or a series of linked events unplanned by the financial entity that compromises the security of the network and information systems, and has an adverse impact on the availability, authenticity, integrity or confidentiality of data, or on the services provided by the financial entity.
- Significant cyber threat: a cyber threat the technical characteristics of which indicate that it could have the potential to result in a major ICT-related incident or a major operational or security payment-related incident.
ICT Risk Management:
Financial entities:
- Shall have in place an internal governance and control framework that ensures an effective and prudent management of ICT risk in order to achieve a high level of digital operational resilience. The management body of the financial entity will define, approve, oversee and be responsible for the implementation of all arrangements related to the ICT risk management framework.
- Are required to implement a robust and well-documented ICT risk management framework within their broader risk systems. This framework should empower them to manage ICT risks swiftly and effectively, ensuring strong digital resilience. It must include essential strategies, policies, procedures, protocols, and tools to safeguard all information and ICT assets.
- In order to address and manage ICT risk, shall use and maintain updated ICT systems, protocols and tools.
- Shall identify, classify and adequately document all ICT supported business functions, roles and responsibilities, the information assets and ICT assets supporting those functions, and their roles and dependencies in relation to ICT risk. Financial entities shall review as needed, and at least yearly, the adequacy of this classification and of any relevant documentation.
- Shall continuously monitor and control the security and functioning of ICT systems and tools and shall minimise the impact of ICT risk on ICT systems through the deployment of appropriate ICT security tools, policies and procedures.
- Shall have in place mechanisms to promptly detect anomalous activities, including ICT network performance issues and ICT-related incidents, and to identify potential material single points of failure. These mechanisms shall be regularly tested.
- Shall put in place a comprehensive ICT business continuity policy, which may be adopted as a dedicated specific policy, forming an integral part of the overall business continuity policy of the financial entity and implement associated ICT response and recovery plans.
- Shall develop backup policies and procedures, restoration and recovery procedures and methods.
- Shall have in place the capabilities and staff needed to provide training, and shall put in place post ICT-related incident reviews which allow the financial entity to identify improvements to be made and to assess whether the established procedures were followed and effective.
- Shall embed into their ICT risk management framework a comprehensive communication strategy, covering crisis communication plans, internal and external communication policies, and a designated person responsible for managing disclosures and media relations in the event of ICT-related incidents.
Reporting Of Major ICT-Related Incidents and Voluntary Notification:
Financial entities shall define, establish and implement an ICT-related incident management process to detect, manage and notify ICT-related incidents. Financial entities shall classify ICT-related incidents and shall determine their impact based on criteria set by the DORA Regulation:
- the number or relevance of clients or financial counterparts affected;
- the duration of the ICT-related incident, including the service downtime;
- the geographical spread with regard to the areas affected by the ICT-related incident, particularly if it affects two or more Member States;
- the data losses that the ICT-related incident entails, in relation to availability, authenticity, integrity or confidentiality of data;
- the criticality of the services affected; and
- the economic impact, in particular direct and indirect costs and losses, of the ICT-related incident in both absolute and relative terms.
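Purely as an illustration, the classification criteria above can be modeled as a checklist. The field names and numeric thresholds below are hypothetical placeholders, not values from DORA or its regulatory technical standards:

```python
from dataclasses import dataclass

@dataclass
class IncidentAssessment:
    """Hypothetical record of a DORA-style classification assessment."""
    clients_affected_pct: float      # share of clients or counterparts affected
    downtime_hours: float            # duration, including service downtime
    member_states_affected: int      # geographical spread
    data_losses: bool                # impact on availability/authenticity/integrity/confidentiality
    critical_service_affected: bool  # criticality of the services affected
    economic_impact_eur: float       # direct and indirect costs and losses

def count_criteria_met(a: IncidentAssessment) -> int:
    """Count how many of the listed criteria are triggered.

    The thresholds here are invented for illustration; the real ones
    are set by the regulatory technical standards, not by this sketch.
    """
    checks = [
        a.clients_affected_pct > 10.0,
        a.downtime_hours > 2.0,
        a.member_states_affected >= 2,
        a.data_losses,
        a.critical_service_affected,
        a.economic_impact_eur > 100_000,
    ]
    return sum(checks)

incident = IncidentAssessment(
    clients_affected_pct=15.0, downtime_hours=5.0,
    member_states_affected=2, data_losses=True,
    critical_service_affected=True, economic_impact_eur=250_000,
)
print(count_criteria_met(incident))  # 6
```

In practice, how many (and which) triggered criteria make an incident "major" is determined by the regulatory technical standards, not by a simple count.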
They will classify cyber threats and identify significant ICT-related incidents according to the criteria set out in the DORA Regulation and in the regulatory technical standards drafted by the European Supervisory Authorities.
Financial entities may, on a voluntary basis, notify significant cyber threats to the relevant competent authority when they deem the threat to be of relevance to the financial system, service users or clients.
For major ICT-related incidents, financial entities shall submit to the competent authority:
- an initial notification which will be transmitted by the competent authority to the European Banking Authority (EBA), European Securities and Markets Authority (ESMA) or European Insurance and Occupational Pensions Authority (EIOPA), in certain circumstances, the European Central Bank (ECB), and other recipients listed in Article 19(6) of the DORA Regulation.
- an intermediate report as soon as the status of the original incident has changed significantly or the handling of the major ICT-related incident has changed based on new information available, followed, as appropriate, by updated notifications every time a relevant status update is available, as well as upon a specific request of the competent authority;
- a final report, when the root cause analysis has been completed, regardless of whether mitigation measures have already been implemented, and when the actual impact figures are available to replace estimates.
Digital Operational Resilience Testing:
For the purpose of assessing preparedness for handling ICT-related incidents, of identifying weaknesses, deficiencies and gaps in digital operational resilience, and of promptly implementing corrective measures, financial entities, other than microenterprises, shall, taking into account the criteria set out in Article 4(2), establish, maintain and review a sound and comprehensive digital operational resilience testing programme as an integral part of the ICT risk-management framework.
The digital operational resilience testing programme shall include a range of assessments, tests, methodologies, practices and tools in accordance with Articles 25 and 26, which cover the testing of ICT tools and systems, advanced testing of ICT tools, systems and processes based on threat-led penetration testing (TLPT), and the requirements for testers carrying out TLPT.
The testing shall include, among other things:
- vulnerability analyses;
- network security assessments;
- physical security reviews;
- end-to-end crisis simulation tests;
- threat-based cyber penetration tests.
These tests shall be carried out at least once a year.
Risk Management Associated With Third-Party ICT Service Providers and the Information Register:
Financial entities shall manage ICT third-party risk as an integral component of ICT risk within their ICT risk management framework.
Financial entities that have in place contractual arrangements for the use of ICT services to run their business operations shall, at all times, remain fully responsible for compliance with, and the discharge of, all obligations under this Regulation and applicable financial services law;
The rights and obligations of the financial entity and of the ICT third-party service provider shall be clearly allocated and set out in writing. The full contract shall include the service level agreements and be documented in one written document which shall be available to the parties on paper, or in a document with another downloadable, durable and accessible format. It must include the elements set out in Article 30(2).
Financial entities’ management of ICT third-party risk shall be implemented in light of the principle of proportionality, taking into account:
- (i) the nature, scale, complexity and importance of ICT-related dependencies,
- (ii) the risks arising from contractual arrangements on the use of ICT services concluded with ICT third-party service providers, taking into account the criticality or importance of the respective service, process or function, and the potential impact on the continuity and availability of financial services and activities, at individual and at group level.
Before entering into a contractual arrangement on the use of ICT services, financial entities shall:
- assess whether the contractual arrangement covers the use of ICT services supporting a critical or important function;
- assess if supervisory conditions for contracting are met;
- identify and assess all relevant risks in relation to the contractual arrangement, including the possibility that such contractual arrangement may contribute to reinforcing ICT concentration risk as referred to in Article 29;
- undertake all due diligence on prospective ICT third-party service providers and ensure throughout the selection and assessment processes that the ICT third-party service provider is suitable;
- identify and assess conflicts of interest that the contractual arrangement may cause.
Financial entities may only enter into contractual arrangements with ICT third-party service providers that comply with appropriate information security standards. When those contractual arrangements concern critical or important functions, financial entities shall, prior to concluding the arrangements, take due consideration of the use, by ICT third party service providers, of the most up-to-date and highest quality information security standard.
Financial entities shall report to the competent authority at least once a year on new contractual arrangements on the use of ICT services. Planned contractual arrangements covering ICT services that support critical or important functions must also be communicated in a timely manner.
Financial entities shall ensure that a contract can be terminated in certain circumstances, in particular where the third-party provider has certain deficiencies in its ICT risk management;
Financial entities shall establish exit strategies for ICT services supporting critical or important functions in the event of provider failure.
As part of their ICT risk management framework, financial entities shall maintain and update at entity level, and at sub-consolidated and consolidated levels, a register of information in relation to all contractual arrangements on the use of ICT services provided by ICT third-party service providers.
Penalties:
It is left to each Member State to decide on the penalty. They may adopt ‘any type of measure, including financial penalties, to ensure that financial entities continue to comply with their legal obligations.’
Effective Date:
January 17, 2025
EU Cybersecurity Act, Data Act, & e-Privacy Directive
EU Cybersecurity Act
Regulation (EU) 2019/881 of the European Parliament and of the Council of 17 April 2019 on ENISA (the European Union Agency for Cybersecurity) and on information and communications technology cybersecurity certification and repealing Regulation (EU) No 526/2013 (Cybersecurity Act) (Text with EEA relevance.)
Highlights
Subject Matter:
- Objectives, tasks and organisational matters relating to ENISA (the European Union Agency for Cybersecurity); and
- Framework for the establishment of European cybersecurity certification schemes for the purpose of ensuring an adequate level of cybersecurity for ICT products, ICT services and ICT processes in the Union, as well as for the purpose of avoiding the fragmentation of the internal market with regard to cybersecurity certification schemes in the EU.
Cybersecurity Certification Framework:
- The Commission shall publish a list of ICT products, ICT services and ICT processes, or categories thereof, capable of benefiting from being included in the scope of a European cybersecurity certification scheme.
- The certification scheme will specify one or more of the following assurance levels for ICT products, ICT services and ICT processes:
- Basic (e.g. for Internet of Things devices)
- Substantial
- High
- The certification scheme may allow for a conformity self-assessment under the sole responsibility of the manufacturer or provider of ICT products, ICT services or ICT processes, only when they present a low risk which corresponds to assurance level ‘basic.’
- The manufacturer or provider of certified ICT products, ICT services or ICT processes or of ICT products, ICT services and ICT processes for which an EU statement of conformity has been issued shall make publicly available the following supplementary cybersecurity information:
- Guidance and recommendations to assist end users with the secure configuration, installation, deployment, operation and maintenance of the ICT products or ICT services;
- The period during which security support will be offered to end users, in particular as regards the availability of cybersecurity related updates;
- Contact information of the manufacturer or provider and accepted methods for receiving vulnerability information from end users and security researchers;
- A reference to online repositories listing publicly disclosed vulnerabilities related to the ICT product, ICT service or ICT process and to any relevant cybersecurity advisories.
- The cybersecurity certification is voluntary unless otherwise specified by EU law or Member State law. ICT products, ICT services and ICT processes that have been certified under a European cybersecurity certification scheme adopted pursuant to Article 49 shall be presumed to comply with the requirements of such scheme.
Effective Date:
June 27, 2019
Data Act
Regulation (EU) 2023/2854 of the European Parliament and of the Council of 13 December 2023 on harmonised rules on fair access to and use of data and amending Regulation (EU) 2017/2394 and Directive (EU) 2020/1828 (Data Act) (Text with EEA relevance.)
Highlights
Subject Matter and Scope:
- Subject Matter:
- making available product data and related data to the user of the connected product or related service.
- making available of data by data holders to data recipients.
- facilitating switching between data processing services.
- the development of interoperability standards for data to be accessed, transferred and used.
- Scope:
- Personal and non-personal data, as distinguished by the GDPR.
- Applies to manufacturers of connected products (e.g. Internet of Things) placed on the EU market and providers of related services, no matter the place of establishment of the manufacturers or providers.
- Users in the EU of connected products or related services.
- Data holders that make data available to data recipients in the Union, no matter the data holders’ place of establishment.
- Providers of data processing services to customers in the EU, irrespective of the providers’ place of establishment.
- Participants in data spaces and vendors of applications using smart contracts and persons whose trade, business or profession involves the deployment of smart contracts for others in the context of executing an agreement.
- Interaction with other EU legislations:
- Article 1(5) states that the Data Act Regulation does not supersede the EU General Data Protection Regulation (GDPR) when personal data is processed, nor the e-Privacy Directive (known as the “Cookie Law”). The Data Act provisions set in Chapter II of the Regulation complement the right of access by data subjects and the right to data portability of the GDPR.
- In case of conflict between the Data Act Regulation and other EU legislation, the Data Act indicates that “the relevant Union law on the protection of personal data or privacy shall prevail.” It is therefore important to compare, case by case, the GDPR and the Data Act to ensure that the application of one does not conflict with the application of the other.
Key Definitions:
- Connected product: an item that obtains, generates or collects data concerning its use or environment and that is able to communicate product data via an electronic communications service, physical connection or on-device access, and whose primary function is not the storing, processing or transmission of data on behalf of any party other than the user.
- Data: any digital representation of acts, facts or information and any compilation of such acts, facts or information, including in the form of sound, visual or audio-visual recording.
- Data holder: a natural or legal person that has the right or obligation, in accordance with this Regulation, applicable Union law or national legislation adopted in accordance with Union law, to use and make available data, including, where contractually agreed, product data or related service data which it has retrieved or generated during the provision of a related service.
- Interoperability: the ability of two or more data spaces or communication networks, systems, connected products, applications, data processing services or components to exchange and use data in order to perform their functions.
- Metadata: a structured description of the contents or the use of data facilitating the discovery or use of that data.
- Non-personal data: data other than personal data.
- Personal data: any information relating to an identified or identifiable natural person. A natural person is identifiable if they can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier, or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.
- Product data: data generated by the use of a connected product that the manufacturer designed to be retrievable, via an electronic communications service, physical connection or on-device access, by a user, data holder or a third party, including, where relevant, the manufacturer.
- Processing: any operation or set of operations which is performed on data or on sets of data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination, or other means of making them available, alignment or combination, restriction, erasure or destruction.
- Related service: a digital service, other than an electronic communications service, including software, which is connected with the product at the time of the purchase, rent or lease in such a way that its absence would prevent the connected product from performing one or more of its functions, or which is subsequently connected to the product by the manufacturer or a third party to add to, update or adapt the functions of the connected product.
- Related service data: data representing the digitisation of user actions or of events related to the connected product, recorded intentionally by the user or generated as a by-product of the user’s action during the provision of a related service by the provider.
- User: a natural or legal person that owns a connected product or to whom temporary rights to use that connected product have been contractually transferred, or that receives related services.
Main Provisions Related to Personal Data:
Make Product Data and Related Service Data Accessible to the User (Business-to-Consumer and Business-to-Business)
- Connected products and related services must be designed and manufactured so that, where possible, product data and related service data are directly accessible to the user. The data must be made available easily, securely, free of charge, and in a comprehensive, structured, commonly used and machine-readable format.
- Before concluding a contract for the purchase, rent or lease of a connected product, the seller, rentor or lessor needs to provide certain pieces of information on the product data and related service data.
- Before concluding a contract for the provision of a related service, the provider must provide certain pieces of information to the user in a clear and comprehensible manner.
- Where data cannot be directly accessed by the user from the connected product or related service, data holders shall make the data, as well as the metadata necessary to interpret and use it, readily available and accessible to the user.
- If the user is not the data subject whose personal data is requested, any personal data generated by the use of a connected product or related service shall be made available by the data holder to the user only where:
- there is a valid legal basis for processing under Article 6 of the GDPR;
- there is a valid legal basis for the processing of special categories of personal data where applicable, under Article 9 of the GDPR;
- the user has given his or her prior consent, where the e-Privacy Directive applies.
- The user has a right to share data with third parties.
- A third party may only process the personal data made available to it once it has complied with the GDPR.
- The Data Act expands on the obligations of data holders to make data available pursuant to EU law: in business-to-business relationships this must be done under fair, reasonable and non-discriminatory terms and conditions, and in a transparent manner. Compensation may be agreed upon. The Data Act also allows data holders to apply appropriate technical protection measures (e.g. contracts and encryption) to prevent unauthorized access to data and metadata.
- The Data Act expands the definition of unfair contractual terms related to data access and use between enterprises.
- Public sector bodies, the Commission, the European Central Bank and EU bodies can request that data be made available on the basis of an exceptional need. Where personal data are to be processed, these bodies must state the purpose of the processing, specify the technical and organizational measures necessary and proportionate to implement data protection principles and safeguards, such as pseudonymization, and indicate whether anonymization can be applied by the data holder before making the data available. The data holder shall then anonymize the data, unless compliance with the request requires the disclosure of personal data. Data holders are entitled to fair compensation for complying with the request.
Switching Between Data Processing Services
- Providers of data processing services shall take measures to enable customers to switch to a data processing service, covering the same service type, which is provided by a different provider of data processing services, or to on-premises ICT infrastructure, or, where relevant, to use several providers of data processing services at the same time.
- The rights of the customer and the obligations of the provider of data processing services in relation to switching between providers of such services or, where applicable, to an on-premises ICT infrastructure shall be clearly set out in a written contract. The provider of data processing services shall make that contract available to the customer prior to signing the contract in a way that allows the customer to store and reproduce the contract. The Data Act requires the contract to include certain clauses.
- Providers of data processing services must provide the customer with certain information.
- There is an obligation applying to all parties to cooperate in good faith to make the switching process effective.
- The Data Act regulation includes contractual transparency obligations on international access and transfer that apply to providers of data processing services.
- The gradual withdrawal of switching charges is as follows:
- From 12 January 2027, providers of data processing services shall not impose any switching charges on the customer for the switching process.
- From 11 January 2024 until 12 January 2027, providers of data processing services may impose reduced switching charges on the customer for the switching process.
- The Data Act regulation requires certain technical aspects of switching to be met by providers of data processing services.
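As a sketch only, the switching-charge timeline above can be expressed as a date check. The dates come from the Data Act; the function itself, and its three-way split, are an illustrative simplification:

```python
from datetime import date

def switching_charge_regime(on: date) -> str:
    """Map a date to the Data Act switching-charge regime (illustrative).

    The cut-off dates are those stated in the Regulation; everything
    else about this function is a simplification for exposition.
    """
    if on >= date(2027, 1, 12):
        return "no switching charges"
    if on >= date(2024, 1, 11):
        return "reduced switching charges permitted"
    return "Data Act not yet applicable"

print(switching_charge_regime(date(2026, 6, 1)))  # reduced switching charges permitted
print(switching_charge_regime(date(2027, 2, 1)))  # no switching charges
```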
Interoperability
- The Data Act regulation sets out essential requirements regarding: interoperability of data, of data sharing mechanisms and services, as well as of common European data spaces; interoperability for the purpose of in-parallel use of data processing services; interoperability of data processing services; and smart contracts for executing data sharing agreements.
Implementation and Enforcement
- The supervisory authorities responsible for monitoring the application of the GDPR in each Member State shall be responsible for monitoring the application of this Regulation insofar as the protection of personal data is concerned.
- Natural and legal persons have the right to lodge a complaint and the right to an effective judicial remedy.
- Penalties are to be laid down by each Member State.
- The Commission will develop and recommend non-binding model contractual terms on data access and use.
Effective Date:
- For most obligations, September 12, 2025.
- Obligations relating to the design and manufacturing of connected products will apply to the products and connected services placed on the market after September 12, 2026.
e-Privacy Directive
Directive 2002/58/EC of The European Parliament and of The Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications).
Highlights
Subject Matter and Scope:
- The e-Privacy directive harmonises Member States provisions concerning the right to privacy, with respect to the processing of personal data in the electronic communication sector and to ensure the free movement of such data and of electronic communication equipment and services in the EU.
- The e-Privacy directive applies to the processing of personal data in connection with the provision of publicly available electronic communications services in public communications networks in the EU.
Key Definitions:
- Traffic data: “any data processed for the purpose of the conveyance of a communication on an electronic communications network or for the billing thereof.”
- Location data: “any data processed in an electronic communications network, indicating the geographic position of the terminal equipment of a user of a publicly available electronic communications service.”
Main Provisions Related to Personal Data Protection:
Confidentiality of Communications
- Listening, tapping, storage or other kinds of interception or surveillance of communications made via a public communications network and via publicly available electronic communications services are prohibited, unless:
- The users concerned have given their prior consent; or
- The interception is legally authorized, such as in the course of lawful business practice, for the purpose of providing evidence of a commercial transaction or of any other business communication.
- Storing or accessing data on a user's device through electronic communications networks is only allowed if:
- The subscriber or user has been provided with clear and comprehensive information about the purposes of the processing and is given a real choice to refuse; or
- The storage or access is:
- Solely technical or for the purpose of transmitting a communication; or
- Strictly necessary in order to provide the service explicitly requested by the subscriber or user.
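A hedged sketch of the decision logic above; the function and flag names are invented for illustration and are not part of the Directive:

```python
def storage_or_access_allowed(
    informed_and_not_refused: bool,
    solely_for_transmission: bool,
    strictly_necessary_for_requested_service: bool,
) -> bool:
    """Illustrative gate for storing or reading data on a user's device.

    Mirrors the structure of the conditions listed above: either one of
    the two exemptions applies, or the user was informed of the purposes
    and given a real choice to refuse.
    """
    if solely_for_transmission or strictly_necessary_for_requested_service:
        return True  # exempt from the information/choice requirement
    return informed_and_not_refused

# A third-party tracking cookie set with no information given to the user:
print(storage_or_access_allowed(False, False, False))  # False
# A session cookie strictly necessary for the service the user requested:
print(storage_or_access_allowed(False, False, True))   # True
```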
Data Retention for Traffic Data and Location Data
- Traffic data
- General rule for traffic data: data relating to subscribers and users processed and stored by the provider of a public communications network or publicly available electronic communications service must be erased or made anonymous when it is no longer needed for the purpose of the transmission of a communication.
- Exception to the general traffic data rule: the Marketing services rule: the provider of a publicly available electronic communications service may process the data to the extent and for the duration necessary for such services or marketing, only if:
- the subscriber or user to whom the data relate has given his/her prior consent;
- users or subscribers have been given the possibility to withdraw their consent for the processing of traffic data at any time; and
- users or subscribers have been informed by the service provider of the types of traffic data which are processed and the duration of the processing.
- Location data
- General rule for location data: location data other than traffic data, relating to users or subscribers of public communications networks or publicly available electronic communications services, may only be processed when the data are made anonymous.
- Exception to the general rule for location data:
- the service provider has informed the users or subscribers of the specific type of location data that will be processed, as well as whether the data will be transmitted to a third party for the purpose of providing the value added service;
- the service provider has obtained prior consent of users or subscribers; and
- the user has the possibility of temporarily refusing the processing of the data for each connection to the processing of the location data or for each transmission of a communication. This possibility must be accessible by a simple means, and free of charge.
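As a hedged illustration of the location data rules above (the function name and parameters are invented, not taken from the Directive):

```python
def may_process_location_data(
    anonymised: bool,
    user_informed: bool,
    prior_consent: bool,
    can_refuse_per_connection: bool,
) -> bool:
    """Illustrative gate for processing location data other than traffic data.

    Either the data are made anonymous (the general rule), or all three
    conditions of the value added service exception listed above hold.
    """
    if anonymised:
        return True
    return user_informed and prior_consent and can_refuse_per_connection

# Consent given but no per-connection refusal mechanism offered:
print(may_process_location_data(False, True, True, False))  # False
# Fully informed consent with a simple, free refusal mechanism:
print(may_process_location_data(False, True, True, True))   # True
```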
Penalties:
Penalties are laid down by each Member State.
Effective Date:
Upon transposition by Member States, which was to be completed by October 31, 2003.
EU Cyber Resilience Act, EU AI Act, EU DMA, & EU DSA
EU Cyber Resilience Act
Regulation (EU) 2024/2847 of the European Parliament and of the Council of 23 October 2024 on horizontal cybersecurity requirements for products with digital elements and amending Regulations (EU) No 168/2013 and (EU) No 2019/1020 and Directive (EU) 2020/1828 (Cyber Resilience Act).
Highlights
Subject Matter:
- Rules for the making available on the market of products with digital elements to ensure the cybersecurity of such products.
- Essential cybersecurity requirements for the design, development and production of products with digital elements, and obligations for economic operators in relation to those products with respect to cybersecurity.
- Essential cybersecurity requirements for the vulnerability handling processes put in place by manufacturers to ensure the cybersecurity of products with digital elements during the time the products are expected to be in use, and obligations for economic operators in relation to those processes.
- Rules on market surveillance, including monitoring, and enforcement of the rules and requirements under the Cyber Resilience Act.
Scope:
The CRA regulation applies to products with digital elements made available on the market whose intended purpose or reasonably foreseeable use includes a direct or indirect logical or physical data connection to a device or network. This can include, for instance, free and open-source software made available on the market in the course of a commercial activity.
See Article 2 of the CRA for products with digital elements that are not covered by the regulation. They mainly involve software as a service, certain areas that are already covered by specific legislation (medical devices, land vehicles, and aircraft), and products falling within the sovereign powers of Member States.
Obligations apply to:
- manufacturers of products put on the market under their own name or trademark (articles 13, 14 and 15 on general obligations and reporting obligations on manufacturers);
- distributors who make available a product on the EU market (article 19); and
- importers of products with digital elements (article 20).
Key definitions:
- Manufacturer: a natural or legal person who develops or manufactures products with digital elements or has products with digital elements designed, developed or manufactured, and markets them under its name or trademark, whether for payment, monetisation or free of charge.
- Product with digital elements: a software or hardware product and its remote data processing solutions, including software or hardware components being placed on the market separately.
- A severe incident having an impact on the security of the product with digital elements: an incident which:
- negatively affects or is capable of negatively affecting the ability of a product with digital elements to protect the availability, authenticity, integrity or confidentiality of sensitive or important data or functions; or
- has led or is capable of leading to the introduction or execution of malicious code in a product with digital elements or in the network and information systems of a user of the product with digital elements.
- CSIRT: Computer Security Incident Response Team.
- ENISA: the European Union Agency for Cybersecurity.
Types of products and requirements:
- Products with digital elements: shall be made available on the market only where:
- they meet the essential cybersecurity requirements set out in Part I of Annex I, provided that they are properly installed, maintained, used for their intended purpose or under conditions which can reasonably be foreseen, and, where applicable, the necessary security updates have been installed; and
- the processes put in place by the manufacturer comply with the essential cybersecurity requirements set out in Part II of Annex I of the Cyber-Resilience Act.
- Important products with digital elements: Products with digital elements which have the core functionality of a product category set out in Annex III of the CRA Regulation. They shall be subject to the conformity assessment procedures referred to in Article 32(2) and (3).
- Critical products with digital elements: Products with digital elements which have a core functionality and that must obtain, in compliance with delegated acts adopted by the Commission, a European cybersecurity certificate at assurance level at least ‘substantial’ under a European cybersecurity certification scheme adopted pursuant to Regulation (EU) 2019/881, to demonstrate conformity with essential cybersecurity requirements set out in Annex I of the CRA Regulation.
Main Obligations of Manufacturers:
General Obligations:
- When placing a product with digital elements on the market, ensure that it has been designed, developed and produced in accordance with the essential cybersecurity requirements, based on a risk assessment.
- Undertake and document an assessment of the cybersecurity risks associated with a product with digital elements.
- Exercise due diligence when integrating components sourced from third parties.
- Upon identifying a vulnerability in a component, including in an open-source component, which is integrated in the product with digital elements, report the vulnerability to the person or entity manufacturing or maintaining the component, and address and remediate it in accordance with the vulnerability handling requirements set out in Part II of Annex I.
- Update the cybersecurity risk assessment of products when becoming aware of relevant cybersecurity aspects concerning the products with digital elements.
- Keep the technical documentation and the EU declaration of conformity at the disposal of the market surveillance authorities for at least 10 years after the product with digital elements has been placed on the market or for the support period, whichever is longer.
- Manufacturers shall comply with certain transparency obligations.
- Manufacturers shall cooperate with requests made by the market surveillance authorities.
Reporting obligations of manufacturers:
- Manufacturers are required to notify any actively exploited vulnerability contained in the product with digital elements that they become aware of, simultaneously, to the CSIRT designated as coordinator and to ENISA. The manufacturer shall notify that actively exploited vulnerability via the single reporting platform established by the CRA Regulation.
- The manufacturer shall submit:
- An early warning notification of an actively exploited vulnerability, without undue delay and in any event within 24 hours of the manufacturer becoming aware of it. Where applicable, the manufacturer should indicate the Member States in which the product with digital elements has been made available.
- Unless the relevant information has already been provided, a vulnerability notification, without undue delay and in any event within 72 hours of the manufacturer becoming aware of the actively exploited vulnerability. The notification shall contain general information about:
- The product with digital elements concerned
- The general nature of the exploit and of the vulnerability concerned
- Any corrective or mitigating measures taken
- Corrective or mitigating measures that users can take; and
- Where applicable, how sensitive the manufacturer considers the notified information to be.
- Unless the relevant information has already been provided, a final report, no later than 14 days after a corrective or mitigating measure is available, including at least the following:
- a description of the vulnerability, including its severity and impact;
- where available, information concerning any malicious actor that has exploited or that is exploiting the vulnerability; and
- details about the security update or other corrective measures that have been made available to remedy the vulnerability.
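The staged deadlines above (24 hours, 72 hours, 14 days) lend themselves to automation in a manufacturer's incident-response tooling. A minimal sketch of that arithmetic, assuming a simple timestamp-based workflow (the function names and structure are illustrative, not taken from the Regulation):

```python
from datetime import datetime, timedelta

# Staged CRA Article 14 deadlines for an actively exploited vulnerability,
# measured from the moment the manufacturer becomes aware of it.
# The final report runs from the availability of a corrective or
# mitigating measure, so it is computed separately.

def vulnerability_deadlines(aware_at: datetime) -> dict:
    """Latest permissible submission time for each notification."""
    return {
        "early_warning": aware_at + timedelta(hours=24),
        "vulnerability_notification": aware_at + timedelta(hours=72),
    }

def final_report_deadline(measure_available_at: datetime) -> datetime:
    """Final report: no later than 14 days after a corrective or
    mitigating measure is available."""
    return measure_available_at + timedelta(days=14)
```

The same pattern extends to the severe-incident timeline (24 hours, 72 hours, one month after the incident notification).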
- Manufacturers are required to notify any severe incident having an impact on the security of the product with digital elements that they become aware of, simultaneously, to the CSIRT designated as coordinator, in accordance with Article 14(7) of the CRA Regulation, and to ENISA. The manufacturer shall notify that incident via the single reporting platform established by the CRA Regulation.
- The manufacturer shall submit:
- an early warning notification without undue delay and in any event within 24 hours of the manufacturer becoming aware of it, including at least:
- whether the incident is suspected of being caused by unlawful or malicious acts; and
- where applicable, the Member States on the territory of which the manufacturer is aware that their product with digital elements has been made available.
- unless the relevant information has already been provided, an incident notification, without undue delay and in any event within 72 hours of the manufacturer becoming aware of the incident, which must provide general information as set out in article 14(4)(b).
- unless the relevant information has already been provided, a final report, within one month after the submission of the incident notification. It must include certain information set out in article 14(4)(c).
- After becoming aware of an actively exploited vulnerability or a severe incident having an impact on the security of the product with digital elements, the manufacturer shall inform the impacted users of the product with digital elements:
- of the vulnerability or incident; and
- where necessary, of any risk mitigation and corrective measures that the users can deploy to mitigate the impact of that vulnerability or incident, where appropriate in a structured, machine-readable format that is easily automatically processable.
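The Regulation asks that risk-mitigation information be provided, where appropriate, in "a structured, machine-readable format that is easily automatically processable"; in practice this points toward standardised advisory formats such as CSAF (Common Security Advisory Framework). A hypothetical minimal JSON advisory, with a schema invented purely for illustration:

```python
import json

# Hypothetical minimal user advisory. The field names below are our own,
# not prescribed by the CRA; a real deployment would likely use a
# standardised format such as CSAF.
advisory = {
    "product": "ExampleCam firmware",      # fictional product with digital elements
    "type": "vulnerability",               # or "incident"
    "summary": "Authentication bypass in the remote-access component.",
    "mitigations": [
        "Install security update 2.4.1.",
        "Disable remote access until the update is installed.",
    ],
}

machine_readable = json.dumps(advisory, indent=2)
print(machine_readable)
```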
Conformity of the Product with Digital Elements:
Compliance may be achieved through the following means:
- Presumption of conformity: in the case of products with digital elements and processes put in place by the manufacturer which are in conformity with harmonised standards or parts thereof.
- EU declaration of conformity: drawn up by manufacturers in accordance with Article 13(12) of the CRA Regulation and state that the fulfilment of the applicable essential cybersecurity requirements set out in Annex I has been demonstrated.
- CE marking.
- Technical documentation.
- Conformity assessment procedures for products with digital elements.
- Support measures for microenterprises and small and medium-sized enterprises, including start-ups.
- Mutual recognition agreements.
Notification of Conformity Assessment Bodies:
Each Member State shall designate a notifying authority responsible for setting up and carrying out the necessary procedures for the assessment, designation and notification of conformity assessment bodies and for their monitoring, including the compliance of their subsidiaries and subcontractors.
Market Surveillance and Enforcement:
Market surveillance authorities have the power to:
- carry out, in cooperation with the relevant computer security incident response team (CSIRT), evaluations of products with digital elements that they have sufficient reason to consider present a significant cybersecurity risk, or when informed by the Commission of a product with digital elements that presents a significant cybersecurity risk and that does not comply with the CRA Regulation;
- require manufacturers to put an end to the non-compliance concerning:
- CE markings that have not been affixed in accordance with the requirements set out in the CRA Regulation;
- CE markings that have not been affixed;
- EU declarations of conformity that have not been drawn up, or have not been drawn up correctly;
- the identification number of the notified body involved in the conformity assessment procedure that has not been affixed; and
- technical documentation that is either not available or not complete.
- conduct simultaneous coordinated control actions (sweeps) of particular products with digital elements or categories thereof to check compliance with or to detect infringements to this Regulation;
- require an economic operator to take all appropriate measures where a product is compliant but presents a significant cybersecurity risk, or a risk to the health or safety of persons, to compliance with obligations under EU or national law intended to protect fundamental rights, to the availability, authenticity, integrity or confidentiality of services offered using an electronic information system by essential entities under the NIS 2 Directive, or to other aspects of public interest protection; and
- where the non-compliance persists, the Member State concerned shall take all appropriate measures to restrict or prohibit the product with digital elements from being made available on the market, or ensure that it is recalled or withdrawn from the market.
Penalties:
Administrative fines can reach up to:
- EUR 15 million or 2.5% of total worldwide annual turnover in the previous financial year, whichever is higher, in case of non-compliance with the essential cybersecurity requirements and the obligations of manufacturers.
- EUR 10 million or 2% of total worldwide annual turnover in the previous financial year, whichever is higher, in case of non-compliance with any other obligations under the CRA Regulation.
- EUR 5 million or 1% of total turnover in the previous financial year, whichever is higher, in case of the supply of incorrect, incomplete or misleading information to notified bodies and market surveillance authorities in reply to a request.
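Each tier above is the higher of a fixed cap and a share of annual turnover. A minimal sketch of that arithmetic (the tier labels are our own, not the Regulation's):

```python
# Maximum CRA fine per tier: the higher of a fixed cap (EUR) and a
# percentage of total worldwide annual turnover for the preceding
# financial year. Tier keys are illustrative labels.
CRA_TIERS = {
    "essential_requirements": (15_000_000, 0.025),
    "other_obligations":      (10_000_000, 0.020),
    "misleading_information": (5_000_000,  0.010),
}

def max_fine(tier: str, annual_turnover_eur: float) -> float:
    fixed_cap, pct = CRA_TIERS[tier]
    return max(fixed_cap, pct * annual_turnover_eur)

# A company with EUR 2 billion turnover: 2.5% = EUR 50 million,
# which exceeds the EUR 15 million fixed cap.
print(max_fine("essential_requirements", 2_000_000_000))  # 50000000.0
```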
Effective Date:
- September 11, 2026 for reporting obligations of manufacturers under Article 14
- December 11, 2027 for main obligations.
EU AI Act
Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act).
Highlights
Subject Matter:
- Purpose of the EU AI Act: improve the functioning of the internal market and promote the uptake of human-centric and trustworthy artificial intelligence (AI), while ensuring a high level of protection of health, safety, fundamental rights enshrined in the Charter (…) against the harmful effects of AI systems in the EU and supporting innovation.
- The EU AI Act lays down
- harmonised rules for the placing on the market, the putting into service, and the use of AI systems in the Union;
- prohibitions of certain AI practices;
- specific requirements for high-risk AI systems and obligations for operators of such systems;
- harmonised transparency rules for certain AI systems;
- harmonised rules for the placing on the market of general-purpose AI models;
- rules on market monitoring, market surveillance, governance and enforcement;
- measures to support innovation, with a particular focus on SMEs, including start-ups.
Scope:
The EU AI Act applies to:
- Providers placing on the market or putting into service AI systems or placing on the market general-purpose AI models in the Union, irrespective of whether those providers are established or located within the Union or in a third country;
- Deployers of AI systems that have their place of establishment or are located within the Union;
- Providers and deployers of AI systems that have their place of establishment or are located in a third country, where the output produced by the AI system is used in the Union;
- Importers and distributors of AI systems;
- Product manufacturers placing on the market or putting into service an AI system together with their product and under their own name or trademark;
- Authorised representatives of providers, which are not established in the Union;
- Affected persons that are located in the Union.
Key Definitions:
- AI system: machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments;
- Authorised representative: a natural or legal person located or established in the Union who has received and accepted a written mandate from a provider of an AI system or a general-purpose AI model to, respectively, perform and carry out on its behalf the obligations and procedures established by this Regulation;
- Deployer: a natural or legal person, public authority, agency or other body using an AI system under its authority except where the AI system is used in the course of a personal non-professional activity;
- Distributor: a natural or legal person in the supply chain, other than the provider or the importer, that makes an AI system available on the Union market;
- General-Purpose AI model: an AI model, including where such an AI model is trained with a large amount of data using self-supervision at scale, that displays significant generality and is capable of competently performing a wide range of distinct tasks regardless of the way the model is placed on the market and that can be integrated into a variety of downstream systems or applications, except AI models that are used for research, development or prototyping activities before they are placed on the market
- High-Risk AI Systems: systems as defined by Articles 6(1) and 6(2).
- Importer: a natural or legal person located or established in the Union that places on the market an AI system that bears the name or trademark of a natural or legal person established in a third country;
- Making available on the market: the supply of an AI system or a general-purpose AI model for distribution or use on the Union market in the course of a commercial activity, whether in return for payment or free of charge;
- Placing on the market: the first making available of an AI system or a general-purpose AI model on the Union market;
- Personal data: personal data as defined in Article 4 of the GDPR, i.e.: “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”
- Provider: a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark, whether for payment or free of charge;
- Putting into service: the supply of an AI system for first use directly to the deployer or for own use in the Union for its intended purpose
- Regulatory sandbox: a controlled framework set up by a competent authority which offers providers or prospective providers of AI systems the possibility to develop, train, validate and test, where appropriate in real-world conditions, an innovative AI system, pursuant to a sandbox plan for a limited time under regulatory supervision.
Personal Data Obligations:
Link with the GDPR: If personal data is processed through an AI system or model, both its providers and those who deploy it must continue to meet their obligations under the GDPR, in their capacity as (joint) controllers or processors.
Main EU AI Act Obligations:
- Human oversight
- Providers and deployers of high-risk AI systems are required to integrate human oversight into the design from the outset. For certain high-risk systems (notably remote biometric identification), no action or decision may be taken on the basis of the system's output unless it has been separately verified and confirmed by at least two natural persons.
- Assessments
- Deployers of high-risk AI systems must perform a fundamental rights impact assessment.
- Data governance
- Training, validation and testing data sets must be subject to data governance and management practices appropriate for the intended purpose of the high-risk AI system. Those practices shall concern in particular: data collection processes and the origin of data, and in the case of personal data, the original purpose of the data collection.
- Data quality
- Providers of high-risk AI systems may process special categories of personal data only when strictly necessary to detect and correct biases. Such processing must be carried out with appropriate safeguards to protect the fundamental rights and freedoms of individuals. The cumulative conditions set out by Article 10(5) must be met.
- Regulatory sandboxes and testing
- In the AI regulatory sandbox, personal data lawfully collected for other purposes may be processed solely for the purpose of developing, training and testing certain AI systems in the sandbox when all of the conditions set out in Article 59(1) of the EU AI Act are met.
- Once the testing in real world conditions has been performed, the subjects’ personal data is to be deleted.
- Subjects of the testing in real world conditions may withdraw from the testing at any time by revoking their informed consent and may request immediate and permanent deletion of their personal data.
Cybersecurity Obligations:
High-risk AI systems:
- Risk management
- Risk management system: providers of high-risk AI systems must establish, implement, document and maintain a risk management system. It is a continuous iterative process, planned and run throughout the entire lifecycle of the high-risk AI system, requiring regular systematic review. It comprises the steps described in Article 9 of the EU AI Act.
- Risk management measures are those that, once implemented, result only in residual risks deemed acceptable.
- Security by design
- Providers of high-risk AI systems must adapt the AI system to the level of risk, so as to limit it to acceptable risks.
- High-risk AI systems must be designed and developed in such a way that they achieve an appropriate level of accuracy, robustness, and cybersecurity, and that they perform consistently in those respects throughout their lifecycle.
- High-risk AI systems must be as resilient as possible regarding errors, faults or inconsistencies that may occur within the system or the environment in which the system operates, in particular due to their interaction with natural persons or other systems. Technical and organisational measures shall be taken in this regard.
- Documentation
- Technical documentation of a high-risk AI system must be drawn up before that system is placed on the market or put into service and must be kept up-to-date. It must demonstrate that the high-risk AI system complies with the cybersecurity requirements.
- High-risk AI systems must be accompanied by instructions for use in an appropriate digital format or otherwise that include concise, complete, correct and clear information that is relevant, accessible and comprehensible to deployers. It must contain certain information set out in the EU AI Act.
- Data governance
- High-risk AI systems which make use of techniques involving the training of AI models with data shall be developed on the basis of training, validation and testing data sets that meet the quality criteria referred to in Article 10 paragraphs 2 and 5 of the EU AI Act.
- Data quality:
- Training, validation and testing data sets must be relevant, sufficiently representative, and to the best extent possible, free of errors and complete in view of the intended purpose. They must have the appropriate statistical properties, including, where applicable, as regards the persons or groups of persons in relation to whom the high-risk AI system is intended to be used. Those characteristics of the data sets may be met at the level of individual data sets or at the level of a combination thereof.
- Data sets must take into account, to the extent required by the intended purpose, the characteristics or elements that are particular to the specific geographical, contextual, behavioural or functional setting within which the high-risk AI system is intended to be used.
- Personal Data: see data quality obligations in the ‘Personal Data Obligations’ section.
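One concrete reading of the representativeness requirement above is a comparison of group proportions in a training set against target population shares. A minimal sketch, assuming an arbitrary 20% relative-deviation tolerance (the EU AI Act prescribes no specific metric or threshold):

```python
from collections import Counter

# Illustrative data-governance check: flag groups whose observed share in
# the training data deviates from a target share by more than `tolerance`
# (relative). The threshold is an assumption for illustration only.

def representation_gaps(samples: list[str], target_shares: dict[str, float],
                        tolerance: float = 0.2) -> dict[str, float]:
    """Return {group: observed_share} for under/over-represented groups."""
    counts = Counter(samples)
    total = len(samples)
    gaps = {}
    for group, target in target_shares.items():
        observed = counts.get(group, 0) / total
        if abs(observed - target) / target > tolerance:
            gaps[group] = observed
    return gaps
```

For example, a 90/10 split against a 50/50 target flags both groups, while a 50/50 split passes.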
- Record-keeping/registers
- High-risk AI systems shall technically allow for the automatic recording of events (logs) over the lifetime of the system.
- Logging capabilities shall enable the recording of events relevant for:
- Identifying situations that may result in the high-risk AI system presenting a risk to the health or safety, or fundamental rights, of persons or in a substantial modification;
- Facilitating the post-market monitoring referred to in the EU AI Act; and
- Monitoring the operation of high-risk AI systems as referred to in the EU AI Act.
- Logging capabilities must, at minimum, provide:
- Recording of the period of each use of the system (start date and time and end date and time of each use);
- The reference database against which input data has been checked by the system;
- The input data for which the search has led to a match;
- The identification of the natural persons involved in the verification of the results, as referred to in the EU AI Act.
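The minimum logging capabilities above can be pictured as a single record type. A sketch with illustrative field names (none of them prescribed by the EU AI Act):

```python
from dataclasses import dataclass, field
from datetime import datetime

# Illustrative record for the minimum logging capabilities listed above.
# Field names are our own; the Act specifies the content, not the schema.
@dataclass
class UseLogEntry:
    start: datetime                     # start date and time of the use
    end: datetime                       # end date and time of the use
    reference_database: str             # database input data was checked against
    matched_inputs: list = field(default_factory=list)  # input data that led to a match
    verifiers: list = field(default_factory=list)       # natural persons verifying results

# Fictional example entry.
entry = UseLogEntry(
    start=datetime(2026, 8, 2, 10, 0),
    end=datetime(2026, 8, 2, 10, 5),
    reference_database="example-watchlist-v3",
    matched_inputs=["sample-0417"],
    verifiers=["operator-A", "operator-B"],
)
```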
- Resilience
- High-risk AI systems must be as resilient as possible regarding errors, faults or inconsistencies that may occur within the system or the environment in which the system operates, in particular due to their interaction with natural persons or other systems. Technical and organisational measures shall be taken in this regard.
- High-risk AI systems shall be resilient against attempts by unauthorised third parties to alter their use, outputs or performance by exploiting system vulnerabilities
- Human oversight
- High-risk AI systems shall be designed and developed in such a way, including with appropriate human-machine interface tools, that they can be effectively overseen by natural persons during the period in which they are in use.
- The instructions for use must contain information on the human oversight measures, including the technical measures put in place to facilitate the interpretation of the outputs of the high-risk AI systems by the deployers;
General-Purpose AI models:
- Providers of general-purpose AI models must:
- Draw up and keep up to date technical documentation of the general-purpose AI model.
- Draw up, keep up-to-date and make available information and documentation to providers of AI systems who intend to integrate the general-purpose AI model into their AI systems.
- Comply with their transparency obligations towards users.
- Put in place a policy to comply with Union law on copyright and related rights.
- Mandate an authorised representative established in the EU, where the provider is established in a third country (a non-EU Member State).
- General-purpose AI models with systemic risk: on top of the previous obligations, providers of these models shall:
- perform model evaluation in accordance with standardised protocols and tools reflecting the state of the art, including conducting and documenting adversarial testing of the model with a view to identifying and mitigating systemic risks.
- assess and mitigate possible systemic risks at Union level, including their sources, that may stem from the development, the placing on the market, or the use of general-purpose AI models with systemic risk.
- keep track of, document, and report, without undue delay, to the AI Office and, as appropriate, to national competent authorities, relevant information about serious incidents and possible corrective measures to address them.
- ensure an adequate level of cybersecurity protection for the general-purpose AI model with systemic risk and the physical infrastructure of the model.
- Note: providers of general-purpose AI models with systemic risk may rely on codes of practice within the meaning of the EU AI Act to demonstrate compliance with the obligations.
Penalties:
- Non-compliance with any of the following provisions related to operators or notified bodies, other than those laid down in Article 5, shall be subject to administrative fines of up to EUR 15 000 000 or, if the offender is an undertaking, up to 3% of its total worldwide annual turnover for the preceding financial year, whichever is higher:
- obligations of providers of high-risk AI systems pursuant to Article 16;
- obligations of authorised representatives pursuant to Article 22;
- obligations of importers of high-risk systems pursuant to Article 23;
- obligations of distributors of high-risk systems pursuant to Article 24;
- obligations of deployers of high-risk systems pursuant to Article 26;
- requirements and obligations of notified bodies pursuant to Article 31, Article 33(1), (3) and (4) or Article 34;
- transparency obligations for providers and deployers pursuant to Article 50.
- Providing incorrect, incomplete, or misleading information: up to EUR 7 500 000 or 1% of total worldwide annual turnover for the preceding financial year, whichever is higher.
- For providers of general-purpose AI models: fines not exceeding 3 % of their annual total worldwide turnover in the preceding financial year or EUR 15 000 000, whichever is higher.
Effective Date:
- Obligations for general-purpose AI models: August 2, 2025
- Most obligations: August 2, 2026.
EU DMA
Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on contestable and fair markets in the digital sector and amending Directives (EU) 2019/1937 and (EU) 2020/1828 (Digital Markets Act).
Highlights
Subject Matter and Scope:
- Purpose: contribute to the proper functioning of the internal market by laying down harmonised rules ensuring, for all businesses, contestable and fair markets in the digital sector across the Union where gatekeepers are present, to the benefit of business users and end users.
- Applies to: core platform services provided or offered by gatekeepers to business users established in the Union or end users established or located in the Union, irrespective of the place of establishment or residence of the gatekeepers and irrespective of the law otherwise applicable to the provision of service.
Main Definitions:
- Core platform service: any of the following:
- online intermediation services
- online search engines
- online social networking services
- video-sharing platform services
- number-independent interpersonal communications services
- operating systems
- web browsers
- virtual assistants
- cloud computing services
- online advertising services, including any advertising networks, advertising exchanges and any other advertising intermediation services, provided by an undertaking that provides any of the core platform services listed in points (a) to (i).
- Gatekeeper: an undertaking providing core platform services, designated pursuant to Article 3 of the DMA Regulation. An undertaking shall be presumed to be a gatekeeper if:
- it achieves an annual Union turnover equal to or above EUR 7,5 billion in each of the last three financial years, or where its average market capitalisation or its equivalent fair market value amounted to at least EUR 75 billion in the last financial year, and it provides the same core platform service in at least three Member States;
- it provides a core platform service that in the last financial year has at least 45 million monthly active end users established or located in the Union and at least 10 000 yearly active business users established in the Union, identified and calculated in accordance with the methodology and indicators set out in the Annex; and
- it met the threshold in point (b) in each of the last three financial years.
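The quantitative presumption above combines a size condition, user-count thresholds, and a durability condition. A simplified sketch of the check (actual designation under Article 3 of the DMA involves notification to, and assessment by, the Commission, and the presumption can be rebutted):

```python
def presumed_gatekeeper(
    union_turnover_eur: list[float],   # annual Union turnover, last 3 financial years
    market_cap_eur: float,             # market capitalisation, last financial year
    member_states_served: int,         # Member States where the core platform service is provided
    monthly_end_users: list[int],      # monthly active end users in the Union, last 3 years
    yearly_business_users: list[int],  # yearly active business users in the Union, last 3 years
) -> bool:
    """Simplified sketch of the DMA Article 3(2) quantitative presumption."""
    # Size: EUR 7.5 billion turnover in each of the last three years, or
    # EUR 75 billion market capitalisation, plus presence in 3+ Member States.
    size = (all(t >= 7_500_000_000 for t in union_turnover_eur)
            or market_cap_eur >= 75_000_000_000) and member_states_served >= 3
    # Users: 45 million monthly end users and 10 000 yearly business users,
    # met in each of the last three financial years.
    users = all(eu >= 45_000_000 and bu >= 10_000
                for eu, bu in zip(monthly_end_users, yearly_business_users))
    return size and users
```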
- Online intermediation services: online intermediation services as defined in Article 2, point (2), of Regulation (EU) 2019/1150, i.e. services which meet all of the following requirements:
- they constitute information society services within the meaning of point (b) of Article 1(1) of Directive (EU) 2015/1535 of the European Parliament and of the Council;
- they allow business users to offer goods or services to consumers, with a view to facilitating the initiating of direct transactions between those business users and consumers, irrespective of where those transactions are ultimately concluded;
- they are provided to business users on the basis of contractual relationships between the provider of those services and business users which offer goods or services to consumers.
Main Provisions:
- Obtaining consent, as defined by the GDPR, from end users in order to:
- process, for the purpose of providing online advertising services, personal data of end users using services of third parties that make use of core platform services of the gatekeeper;
- combine personal data from the relevant core platform service with personal data from any further core platform services or from any other services provided by the gatekeeper or with personal data from third-party services;
- cross-use personal data from the relevant core platform service in other services provided separately by the gatekeeper, including other core platform services, and vice versa; and
- sign in end users to other services of the gatekeeper in order to combine personal data.
- Note: where consent given for those purposes has been refused or withdrawn by the end user, the gatekeeper shall not repeat its request for consent for the same purpose more than once within a period of one year.
- Legal bases
- Legal bases for gatekeepers processing personal data are limited to user consent, legal obligations, vital interests or tasks in the public interest.
- Prohibition
- The gatekeeper must not use, in competition with business users, any data that is not publicly available that is generated or provided by those business users in the context of their use of the relevant core platform services or of the services provided together with, or in support of, the relevant core platform services, including data generated or provided by the customers of those business users.
- Portability of data
- The gatekeeper must provide end users and third parties authorised by an end user, at their request and free of charge, with effective portability of data provided by the end user or generated through the activity of the end user in the context of the use of the relevant core platform service, including by providing, free of charge, tools to facilitate the effective exercise of such data portability, and including by the provision of continuous and real-time access to such data.
- Data access
- The gatekeeper must provide business users and third parties authorised by a business user, at their request, free of charge, with effective, high-quality, continuous and real-time access to, and use of, aggregated and non-aggregated data, including personal data, that is provided for or generated in the context of the use of the relevant core platform services or services provided together with, or in support of, the relevant core platform services by those business users and the end users engaging with the products or services provided by those business users.
- With regard to personal data, the gatekeeper must provide for such access to, and use of, personal data only where the data are directly connected with the use effectuated by the end users in respect of the products or services offered by the relevant business user through the relevant core platform service, and when the end users opt in to such sharing by giving their consent.
- The gatekeeper is obligated to provide any third-party undertaking providing online search engines, at its request, with access on fair, reasonable and non-discriminatory terms to ranking, query, click and view data in relation to free and paid search generated by end users on its online search engines. Any such query, click and view data that constitute personal data shall be anonymised.
- Interoperability
- Where a gatekeeper provides number-independent interpersonal communications services that are listed in the designation decision pursuant to the DMA Regulation it must make the basic functionalities of its number-independent interpersonal communications services (messaging, sharing files, calling, etc.) interoperable with the number-independent interpersonal communications services of another provider offering or intending to offer such services in the Union, by providing the necessary technical interfaces or similar solutions that facilitate interoperability, upon request, and free of charge.
- Transparency
- Within 6 months after its designation, a gatekeeper shall submit to the Commission an independently audited description of any techniques for profiling of consumers that the gatekeeper applies to or across its core platform services listed in the designation decision. The Commission shall transmit that audited description to the European Data Protection Board.
- The gatekeeper shall make publicly available an overview of the audited description referred to above. In doing so, the gatekeeper shall be entitled to take account of the need to respect its business secrets. The gatekeeper shall update that description and that overview at least annually.
Penalties:
- Fines: of up to 10% of the company’s total worldwide annual turnover, or up to 20% in the event of repeated infringements.
- Periodic penalty payments: of up to 5% of the average daily turnover.
Effective Date:
March 6, 2024.
European Union Digital Services Act (EU DSA)
Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act).
Highlights
Subject Matter:
- Aim: contribute to the proper functioning of the internal market for intermediary services by setting out harmonised rules for a safe, predictable and trusted online environment that facilitates innovation and in which fundamental rights enshrined in the Charter, including the principle of consumer protection, are effectively protected.
- The DSA regulation lays down harmonised rules on the provision of intermediary services in the internal market. In particular, it establishes:
- a framework for the conditional exemption from liability of providers of intermediary services;
- rules on specific due diligence obligations tailored to certain specific categories of providers of intermediary services;
- rules on the implementation and enforcement of this Regulation, including as regards the cooperation of and coordination between the competent authorities.
Scope:
- This Regulation shall apply to intermediary services offered to recipients of the service that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary services have their place of establishment.
Key Definitions:
- Consumer: means any natural person who is acting for purposes which are outside his or her trade, business, craft, or profession;
- Intermediary service: one of the following information society services:
- a ‘mere conduit’ service, consisting of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
- a ‘caching’ service, consisting of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request;
- a ‘hosting’ service, consisting of the storage of information provided by, and at the request of, a recipient of the service;
- Online platform: a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service or a minor functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation;
- Profiling: any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements;
- Very large online platforms and very large online search engines: platforms and online search engines which have a number of average monthly active recipients of the service in the EU equal to or higher than 45 million, and which are designated by the Commission via a decision.
Main Provision Related to Data Protection:
- Transparency
- Targeted advertising: providers of online platforms that present advertisements on their online interfaces shall ensure that, for each specific advertisement presented to each individual recipient, the recipients of the service are able to identify, in a clear, concise and unambiguous manner and in real time, the following:
- that the information is an advertisement, including through prominent markings, which might follow standards pursuant to Article 44;
- the natural or legal person on whose behalf the advertisement is presented;
- the natural or legal person who paid for the advertisement if that person is different from the natural or legal person referred to in the previous point;
- meaningful information directly and easily accessible from the advertisement about the main parameters used to determine the recipient to whom the advertisement is presented and, where applicable, about how to change those parameters.
- Note: providers of online platforms are prohibited from presenting advertisements to recipients of the service based on profiling using special categories of personal data as defined by the GDPR.
- Recommender system: Providers of online platforms that use recommender systems shall set out in their terms and conditions, in plain and intelligible language, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters.
- Online protection of minors
- Providers of online platforms accessible to minors shall put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors on their service.
- Providers of online platforms shall not present advertisements on their interface based on profiling using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor.
- Risk assessment and mitigation of risks
- Risk assessment
- Providers of very large online platforms and of very large online search engines shall diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services. They shall carry out the risk assessments at least once every year. This risk assessment shall be specific to their services and proportionate to the systemic risks, taking into consideration their severity and probability, including any actual or foreseeable negative effects on the right to privacy.
- When conducting risk assessments, providers of very large online platforms and of very large online search engines shall take into account:
- the design of their recommender systems and any other relevant algorithmic system;
- their content moderation systems;
- the applicable terms and conditions and their enforcement;
- systems for selecting and presenting advertisements;
- data-related practices of the provider.
- Mitigation of risks
- Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified. Article 35(1) lists the types of measures applicable.
- Data access to researchers
- Upon a reasoned request from the Digital Services Coordinator of establishment, providers of very large online platforms or of very large online search engines shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements set in Article 40 of the DSA, for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the EU, and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures.
Penalties:
The Commission can:
- Apply fines up to 6% of the worldwide annual turnover in case of breach of DSA obligations, failure to comply with interim measures, or breach of commitments.
- Apply periodic penalties up to 5% of the average daily worldwide turnover for each day of delay in complying with remedies, interim measures, or commitments.
- If the infringement persists, the Commission can request the temporary suspension of the service.
Effective Date:
February 17, 2024.