DORA Regulation at a Glance

Regulation (EU) 2022/2554 — the Digital Operational Resilience Act. A binding EU regulation that requires financial entities to ensure they can withstand, respond to, and recover from ICT-related disruptions and threats.

  • 64 total articles
  • 5 core pillars
  • 21 entity types in scope
  • Application date: 17 January 2025

The Five Pillars

DORA is structured around five interconnected pillars. Think of them as the security domains you already work with, but with explicit regulatory teeth.

🛡 Pillar 1: ICT Risk Management

Your risk management framework, policies, asset inventory, detection, response, recovery, and lessons learned. Essentially your ISMS on steroids.

Articles 5–16
🚨 Pillar 2: Incident Reporting

Classify, manage, and report ICT incidents to regulators. Strict timelines. Think SOC + compliance reporting baked into law.

Articles 17–23
🔧 Pillar 3: Resilience Testing

Vulnerability assessments, pen tests, and Threat-Led Penetration Testing (TLPT). Red teaming mandated by law for significant entities.

Articles 24–27
🔗 Pillar 4: Third-Party Risk

Manage your supply chain. Contractual requirements for ICT providers, concentration risk, and an EU oversight framework for critical providers.

Articles 28–44
🤝 Pillar 5: Information Sharing

Voluntary threat intelligence sharing between financial entities. ISACs and CTI exchanges built into the regulatory framework.

Article 45

Why Should a Security Engineer Care?

It's Not Optional

Unlike frameworks (ISO 27001, NIST CSF), DORA is a regulation. Non-compliance means fines and regulatory action, not just a failed audit.

It Mandates Your Toolbox

SIEM, SOAR, vulnerability management, pen testing, BCP/DR, vendor risk — all explicitly required. Your security program IS the compliance program.

Board-Level Accountability

The management body is directly responsible. This gives security teams real leverage to push for resources and get executive buy-in.

Third-Party Scrutiny

Cloud providers, SaaS vendors, managed services — all under the microscope. You'll need to assess, contract, and monitor every critical ICT provider.

DORA vs. Frameworks You Already Know

Aspect | DORA | ISO 27001 | NIST CSF | NIS2
Type | EU Regulation (binding) | Voluntary standard | Voluntary framework | EU Directive (transposed)
Sector | Financial services | Any | Any | Essential & important entities
Pen Testing | Mandatory TLPT for significant entities | Recommended | Recommended | Encouraged
Incident Reporting | Mandatory, strict timelines | Internal process | Internal process | 24h early warning, 72h report
Third-Party | Detailed contractual & oversight rules | Annex A controls | Supply chain category | Supply chain security
Board Accountability | Explicit personal liability | Management commitment | Governance function | Management liability

The Five Pillars — Deep Dive


Pillar 1: ICT Risk Management Framework (Articles 5–16)

This is the backbone of DORA. It requires financial entities to have a comprehensive, documented ICT risk management framework that covers identification, protection, detection, response, and recovery. If you've built a security program based on NIST CSF or ISO 27001, this maps closely — but DORA adds prescriptive detail and makes it legally binding.

Key Requirements Mapped to Security Controls


DORA Requirement | What It Means for You | Tools / Controls
ICT risk management framework (Art. 6) | Documented framework, reviewed annually, approved by management body | GRC platform, policy management
ICT systems inventory (Art. 8) | Complete asset inventory of all ICT systems and information assets | CMDB, asset discovery tools
Identification of risks (Art. 8) | Regular risk assessments, threat modeling, vulnerability identification | Risk register, threat modeling tools
Protection & prevention (Art. 9) | IAM, encryption, network security, secure SDLC, patch management | IAM, WAF, SAST/DAST, patching tools
Detection (Art. 10) | Anomaly detection, continuous monitoring, multiple layers of control | SIEM, EDR, NDR, UEBA
Response & recovery (Art. 11) | IR plans, BCP, DR plans, communication procedures | SOAR, BCP/DR tools, runbooks
Backup & restoration (Art. 12) | Backup policies, regular testing of restoration, separate from production | Backup solutions, DR drills
Learning & evolving (Art. 13) | Post-incident reviews, lessons learned fed back into framework | PIR process, knowledge base
Communication (Art. 14) | Crisis communication plans, designated spokesperson, internal notification procedures | Comms templates, contact lists
Simplified framework (Art. 16) | Proportionate requirements for smaller entities (microenterprises etc.) | Reduced documentation burden
Security Engineer Takeaway: Your existing ISMS likely covers 60–80% of this. The gaps are usually in formal documentation rigor, management body sign-off, annual review cycles, and the level of asset inventory granularity DORA expects. Focus your gap analysis on Articles 8 (identification) and 12 (backup testing) — these tend to be the weakest areas.

Pillar 2: ICT-related Incident Management (Articles 17–23)

Formalises your incident response process and adds mandatory reporting to regulators. This isn't just "have an IR plan" — DORA prescribes classification criteria, reporting timelines, and post-incident reporting.

Incident Classification Criteria

DORA defines major incidents based on these factors:

Criterion | Description
Clients affected | Number of clients / financial counterparts impacted
Reputational impact | Severity of potential reputational damage
Duration | How long the incident persisted, including service downtime
Geographic spread | Number of Member States affected
Data losses | Volume and sensitivity of data compromised
Criticality of services | Whether critical or important functions were affected
Economic impact | Direct and indirect financial costs

Reporting Timeline

  • Initial Notification (T + 4 hours, max 24h): Notify the competent authority that a major ICT incident has occurred. Basic details: what happened, initial impact assessment, when it was detected.
  • Intermediate Report (T + 72 hours): Updated classification, root cause (if known), mitigation actions taken, estimated recovery timeline.
  • Final Report (T + 1 month): Full root cause analysis, total impact, remediation completed, lessons learned, and measures to prevent recurrence.
Security Engineer Takeaway: If you're running a SOC, your incident response playbooks need a "DORA reporting" branch. When triage identifies a major incident, the clock starts. You need automated severity classification that maps to DORA criteria, pre-built report templates, and a clear escalation path to the person who submits to the competent authority. Also note: significant cyber threats (even if no incident occurred) may also need to be reported voluntarily (Art. 19).
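The classification branch can be sketched as a small triage helper. Everything below is illustrative: the field names and numeric thresholds are invented placeholders, since the binding materiality thresholds come from the ESAs' technical standards, not from this sketch.

```python
from dataclasses import dataclass

@dataclass
class IncidentFacts:
    """Facts captured at triage, mirroring the DORA classification criteria."""
    clients_affected: int
    duration_hours: float
    member_states_affected: int
    data_loss: bool                  # any availability/authenticity/integrity/confidentiality impact
    critical_function_affected: bool
    estimated_cost_eur: float

def needs_major_incident_assessment(f: IncidentFacts) -> bool:
    """Playbook branch: escalate for a DORA major-incident assessment when a
    critical or important function is hit and any other criterion fires.
    Thresholds here are placeholders, not the RTS values."""
    triggers = [
        f.clients_affected >= 10_000,
        f.duration_hours >= 24,
        f.member_states_affected >= 2,
        f.data_loss,
        f.estimated_cost_eur >= 100_000,
    ]
    return f.critical_function_affected and any(triggers)
```

Wiring a check like this into the first triage step means the "clock starts" decision is recorded together with the evidence that drove it.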

Pillar 3: Digital Operational Resilience Testing (Articles 24–27)

This is where DORA gets very hands-on for security engineers. It mandates a testing programme with specific requirements that scale with the entity's significance.

Testing Requirements by Tier


Test Type | Who | Frequency | Details
Vulnerability assessments | All financial entities | At least annually | Scans of all ICT systems and applications supporting critical functions
Open-source software analysis | All financial entities | Ongoing | Review of OSS components, known vulnerabilities, license compliance
Network security assessments | All financial entities | At least annually | Network architecture review, segmentation testing, firewall rule review
Scenario-based testing | All financial entities | At least annually | Table-top exercises, business continuity simulations
Penetration testing | All (except microenterprises) | At least annually | Application and infrastructure pen testing
Source code review | Where feasible | As appropriate | Static analysis, code review of critical systems
TLPT (Threat-Led Pen Testing) | Significant entities only | Every 3 years | Red team exercises based on real threat intelligence, following TIBER-EU framework

TLPT — Threat-Led Penetration Testing (Art. 26–27)

The crown jewel for red teamers. DORA mandates TLPT for entities identified by competent authorities as significant. Key rules:

  • Must cover critical or important functions running in production
  • Threat intelligence must be gathered from multiple sources to build realistic attack scenarios
  • External testers must be used (with some exceptions for internal teams)
  • Testers must be certified / accredited, carry professional indemnity insurance
  • Critical ICT third-party providers must be included in scope (pooled testing allowed)
  • Results must be validated by the competent authority
  • Remediation plans must be established and tracked to completion
Security Engineer Takeaway: Annual pen tests and vuln scans are table stakes. If your org is deemed significant, budget for TLPT every 3 years — these are full-scope red team ops using real threat intel (think TIBER-EU). Start building relationships with qualified TLPT providers now. Also, your third-party cloud/SaaS providers may need to participate, which has contractual implications. The testing results feed directly back into Pillar 1 (risk framework updates) and Pillar 2 (incident response improvements).

Pillar 4: ICT Third-Party Risk Management (Articles 28–44)

The largest chapter in DORA. This regulates how financial entities manage risk from ICT service providers (cloud, SaaS, outsourced infra, managed services). It also establishes an EU-level oversight framework for "critical" ICT third-party providers (think AWS, Azure, Google Cloud, Bloomberg, etc.).

Key Obligations for Financial Entities

Strategy & Policy (Art. 28)

All financial entities
  • Adopt a strategy on ICT third-party risk
  • Maintain a register of all ICT contractual arrangements
  • Report the register to competent authorities annually
  • Inform authorities before contracting for critical/important functions

Due Diligence (Art. 28)

All financial entities
  • Assess ICT concentration risk before entering contracts
  • Evaluate provider's ability to meet SLAs
  • Verify adequate security measures exist
  • Assess subcontracting chains

Mandatory Contract Clauses (Art. 30)

All ICT service contracts
  • Clear service descriptions and SLAs
  • Data locations (processing & storage)
  • Audit and access rights for the entity & regulators
  • Incident notification obligations
  • Termination rights and exit strategies
  • BCM participation requirements

Critical/Important Functions (Art. 30)

Contracts supporting critical functions
  • All standard clauses plus:
  • Full service level descriptions with precise quantitative targets
  • Provider must assist during ICT incidents at no extra cost
  • Exit strategies with adequate transition periods
  • Provider participation in TLPT testing
  • Unrestricted right to monitor provider performance

EU Oversight Framework for Critical Third-Party Providers (Art. 31–44)

The ESAs (EBA, ESMA, EIOPA) can designate ICT providers as "critical" based on systemic importance. Once designated:

  • A Lead Overseer is appointed from the ESAs
  • The provider must undergo regular assessments
  • The Lead Overseer can issue recommendations and, if not followed, request financial entities to suspend or terminate use
  • Non-EU providers must establish a subsidiary within the EU (or the oversight framework won't apply directly)
Security Engineer Takeaway: This pillar will dominate your vendor management workload. Start with: (1) Build a complete register of every ICT provider/service. (2) Classify which support critical/important functions. (3) Review all contracts against Art. 30 requirements — most will need amendments. (4) Assess concentration risk: if three critical functions all run on AWS, that's a concentration risk you need to document and mitigate. (5) Ensure you have audit rights and the provider will participate in your testing programme.
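As a sketch of steps (1), (2), and (4), the register can start life as something as simple as a list of tuples with a concentration check over it. Provider names, functions, and the tolerance threshold below are all made up for illustration.

```python
from collections import defaultdict

# Hypothetical register rows: (provider, service, function, function_is_critical)
register = [
    ("CloudCo",  "IaaS hosting",   "payments processing", True),
    ("CloudCo",  "object storage", "customer onboarding", True),
    ("CloudCo",  "IaaS hosting",   "risk reporting",      True),
    ("MailCorp", "email gateway",  "internal comms",      False),
]

def concentration_risks(rows, max_critical_per_provider=2):
    """Flag providers supporting more critical functions than the tolerance
    set in the entity's third-party risk strategy (threshold is illustrative)."""
    per_provider = defaultdict(set)
    for provider, _service, function, critical in rows:
        if critical:
            per_provider[provider].add(function)
    return {p: sorted(fns) for p, fns in per_provider.items()
            if len(fns) > max_critical_per_provider}
```

Here "CloudCo" underpins three critical functions, so it would surface as a concentration risk to document and mitigate.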

Pillar 5: Information Sharing (Article 45)

The shortest pillar, but strategically important. DORA explicitly encourages (but does not mandate) financial entities to share cyber threat intelligence with each other.

What's Covered

  • Exchange of cyber threat intelligence and information (TTPs, IoCs, alerts, configuration tools)
  • Sharing within trusted communities of financial entities
  • Must protect personal data (GDPR compliance), business confidentiality, and competition law
  • Entities must notify competent authorities of their participation in sharing arrangements
Security Engineer Takeaway: If you're not already in a financial sector ISAC or threat intel sharing group (FS-ISAC, etc.), DORA gives you regulatory backing to push for membership. This is the "we should be doing this and now the regulation agrees" lever. Operationally: set up STIX/TAXII feeds, contribute to and consume from trusted sharing communities, and make sure your sharing agreements have proper data handling provisions.
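On the consumption side, a minimal sketch of pulling indicators out of a STIX 2.1 bundle. The bundle shown is a fabricated sample; in practice you would receive it over a TAXII channel from your sharing community.

```python
import json

# A minimal, made-up STIX 2.1 bundle as it might arrive from a sharing partner.
bundle_json = """
{
  "type": "bundle",
  "id": "bundle--0001",
  "objects": [
    {"type": "indicator",
     "id": "indicator--0001",
     "pattern": "[ipv4-addr:value = '203.0.113.7']",
     "pattern_type": "stix",
     "valid_from": "2025-01-17T00:00:00Z"},
    {"type": "malware", "id": "malware--0001", "name": "ExampleLoader", "is_family": false}
  ]
}
"""

def extract_indicator_patterns(raw: str) -> list[str]:
    """Pull the detection patterns out of a STIX bundle so they can be fed
    into SIEM watchlists."""
    bundle = json.loads(raw)
    return [obj["pattern"] for obj in bundle.get("objects", [])
            if obj.get("type") == "indicator"]
```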

Article Explorer

All 64 articles of DORA, grouped by chapter, with details, obligations, and practical security notes for each.

Chapter I — General Provisions

Art. 1 Subject Matter [Standard]

Defines what DORA covers: ICT risk management, incident reporting, resilience testing, third-party risk, and information sharing for financial entities.

Key Points

  • Establishes uniform requirements for the security of network and information systems
  • Covers ICT risk management, major incident reporting, testing, third-party risk, and information sharing
  • Applies to financial entities and ICT third-party service providers
Art. 2 Scope — Entities Covered [Standard]

Lists all 21 types of financial entities in scope. See the "Entities" section for the full breakdown.

In Scope

  • Credit institutions (banks)
  • Payment institutions & e-money institutions
  • Investment firms & trading venues
  • Insurance & reinsurance undertakings
  • Central counterparties & central securities depositories
  • Crypto-asset service providers
  • Trade repositories, data reporting service providers
  • Management companies, AIFMs
  • Crowdfunding service providers
  • ICT third-party service providers (under oversight framework)
  • And more...
Art. 3 Definitions [Important]

57 definitions that DORA uses. Key ones you'll need:

Critical Definitions

  • Digital operational resilience: The ability to build, assure, and review the operational integrity and reliability of an entity's ICT systems
  • ICT risk: Any reasonably identifiable risk related to ICT that, if materialized, may compromise the security of network and information systems
  • ICT-related incident: An unplanned event or series of events compromising the security of ICT systems with an adverse impact on availability, authenticity, integrity, or confidentiality
  • Major ICT-related incident: An incident with a high adverse impact on network/information systems supporting critical or important functions
  • Critical or important function: A function whose disruption would materially impair financial performance, service continuity, or compliance
  • ICT third-party service provider: An undertaking providing ICT services (digital & data services through ICT systems on an ongoing basis)
  • ICT concentration risk: Exposure to individual or highly correlated critical ICT third-party providers creating dependence such that unavailability could endanger the entity
  • Threat-led penetration testing (TLPT): A framework mimicking tactics, techniques, and procedures of real-life threat actors perceived as genuine cyber threats
Security Engineer Takeaway: Bookmark these definitions. "Critical or important function" is the classification you'll use constantly — it determines which requirements apply at an enhanced level. "ICT concentration risk" is the new one that will drive your cloud strategy discussions.
Art. 4 Principle of Proportionality [Important]

Requirements are proportionate to the entity's size, risk profile, nature, scale, and complexity of services. This is your friend — smaller or less complex entities can implement simpler controls.

What This Means

  • Implementation should match the entity's size, risk profile, and complexity
  • Microenterprises get a simplified ICT risk management framework (Art. 16)
  • TLPT only applies to entities designated as significant by competent authorities
  • Proportionality does NOT mean "optional" — the core obligations still apply
Security Engineer Takeaway: Use proportionality strategically. If your entity is smaller, document why certain heavy-lift requirements are met via proportionate measures. If you're a large bank, don't expect proportionality to buy you much slack — regulators will expect the full programme.

Chapter II — ICT Risk Management

Art. 5 Governance and Organisation [Critical]

The management body (board of directors, executive board) bears ultimate responsibility for ICT risk management. This is personal, not delegable.

Management Body Must

  • Define, approve, oversee, and be accountable for the ICT risk management framework
  • Define roles and responsibilities for all ICT-related functions
  • Set and approve the digital operational resilience strategy (at least annually)
  • Approve and review ICT business continuity & DR policies
  • Approve and review ICT audit plans and internal audit activities
  • Allocate adequate budget for ICT security awareness and digital operational resilience training
  • Members must maintain sufficient knowledge and skills on ICT risk (with regular training)
Security Engineer Takeaway: This is your biggest lever. The board is personally accountable. When you need budget, headcount, or project priority, frame it as: "The management body is legally accountable under DORA Art. 5 and needs to demonstrate active oversight." Prepare board-level reporting that shows risk posture, open findings, and compliance gaps.
Art. 6 ICT Risk Management Framework [Critical]

The core requirement: maintain a comprehensive, documented, and regularly updated ICT risk management framework.

Requirements

  • Shall include strategies, policies, procedures, protocols, and tools necessary for ICT risk management
  • Must be documented and reviewed at least annually (and after major incidents)
  • Must be audited by ICT auditors at regular intervals
  • Must incorporate a digital operational resilience strategy with methods to address ICT risk
  • The strategy must include how the entity's ICT risk management framework is implemented
  • Must set the risk tolerance level for ICT risk
  • Must include ICT third-party risk strategy
Security Engineer Takeaway: This is essentially requiring a documented security programme. If you have an ISMS (ISO 27001), you have a head start. Key gap: DORA explicitly requires a digital operational resilience strategy — a document that covers how you'll maintain operations during ICT disruptions. This goes beyond a standard InfoSec policy; it's more aligned with operational resilience / BCP.
Art. 7 ICT Systems, Protocols and Tools [Critical]

Requirements

  • Use and maintain updated ICT systems, protocols, and tools that are reliable, have sufficient capacity, and are technologically resilient
  • Must ensure availability, authenticity, integrity, and confidentiality of data
  • Systems must be securely developed, deployed, and maintained
Art. 8 Identification [Critical]

Know what you have, know what can go wrong, and document it all.

Requirements

  • Identify, classify, and document all ICT-supported business functions, roles, and responsibilities
  • Identify all sources of ICT risk including those from ICT third-party providers
  • Maintain an inventory of all information assets and ICT assets (including hardware, software, network resources)
  • Identify all ICT systems that support critical or important functions
  • Map interconnections and dependencies between ICT assets, systems, processes, and providers
  • Perform risk assessments upon major changes to the ICT infrastructure
  • Maintain network architecture documentation updated at all times
Security Engineer Takeaway: This is your CMDB on steroids. You need a complete inventory of: every server, app, database, network segment, cloud service, SaaS tool, and their interdependencies. The dependency mapping is often the hardest part — consider using automated discovery tools. Pro tip: also map which ICT provider supports which function — you'll need this for Art. 28's register.
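A toy version of that dependency mapping, assuming two invented maps (function to assets, asset to provider) of the kind your CMDB and discovery tooling would produce:

```python
# Hypothetical fragments of the Art. 8 inventory: which assets support which
# business function, and which provider underpins each asset.
function_assets = {
    "payments processing": ["pay-api", "pay-db"],
    "internal wiki":       ["wiki-vm"],
}
asset_provider = {
    "pay-api": "CloudCo",
    "pay-db":  "CloudCo",
    "wiki-vm": "on-prem",
}
critical_functions = {"payments processing"}

def providers_supporting_critical_functions():
    """Join the two maps to see which providers each critical function
    depends on -- the same mapping Art. 28's register needs."""
    out = {}
    for fn in critical_functions:
        out[fn] = sorted({asset_provider[a] for a in function_assets[fn]})
    return out
```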
Art. 9 Protection and Prevention [Critical]

Requirements

  • Develop and implement ICT security policies covering: information security, network security, access control, authentication, change management, patching, encryption, physical security
  • Implement IAM policies: least privilege, need-to-know, segregation of duties
  • Strong authentication mechanisms (MFA where appropriate)
  • ICT change management procedures (documented, tested, approved)
  • Patch management policies with appropriate timelines
  • Encryption policies for data at rest and in transit
  • Secure software development lifecycle (SDLC)
Security Engineer Takeaway: This maps directly to your standard security controls. Review your existing policies against each bullet above. Common gaps: formal ICT change management procedures (not just ITIL ticket flows), documented encryption standards, and a fully codified SDLC with security gates.
Art. 10 Detection [Critical]

Requirements

  • Establish mechanisms to promptly detect anomalous activities (network performance issues, ICT-related incidents)
  • Detection mechanisms must enable multi-layer control, define alert thresholds, and trigger incident response processes
  • Adequate resources for monitoring and analyzing ICT threats and incidents
  • For entities performing TLPT: detection capabilities should be tested as part of the TLPT scope
Security Engineer Takeaway: SIEM, EDR, NDR — you need all three layers. The regulation expects "multi-layer control" which means network + endpoint + application detection. Make sure your detection rules cover the DORA incident classification criteria so you can automatically flag potential major incidents.
Art. 11 Response and Recovery [Critical]

Requirements

  • Establish ICT business continuity policy with plans for all critical systems and functions
  • Develop and implement ICT response and recovery plans
  • Estimate recovery time and recovery point objectives
  • Switch to backup systems with minimal disruption
  • Plans must consider different scenarios including cyber attacks and infrastructure failures
  • Regular testing of BCP/DR plans (at least annually)
  • Crisis communication procedures for internal and external stakeholders
  • Dedicated crisis management function for major incidents
Security Engineer Takeaway: RTOs and RPOs must be documented for every critical function. BCP/DR plans must include cyber attack scenarios (ransomware, data destruction, cloud outage). Test annually at minimum. If you're only doing tabletop exercises, that may not be enough — DORA expects actual switchover tests.
Art. 12 Backup Policies and Procedures [Critical]

Requirements

  • Develop and implement backup and restoration policies and procedures
  • Backup scope and frequency aligned with criticality of the function
  • Backups must be physically and logically separated from the source ICT system
  • When restoring, backup systems must not directly connect to production until verified
  • Regular testing of backup and restoration procedures
  • For ICT systems supporting critical functions: backup must support recovery within the defined RTO/RPO
Security Engineer Takeaway: Air-gapped or immutable backups are implicitly required by the physical/logical separation mandate. Test your restore procedures regularly — not just "does the backup job complete" but "can we actually rebuild the system from this backup." Think ransomware resilience.
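A minimal sketch of the "can we actually rebuild from this backup" check: restore into an isolated directory and compare checksums before anything reconnects to production. The file paths and the copy-based "restore" below stand in for your real backup tooling.

```python
import hashlib
import os
import shutil

def sha256(path: str) -> str:
    """Stream a file through SHA-256 without loading it all into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def restore_and_verify(source: str, backup: str, restore_dir: str) -> bool:
    """Simulate a restore drill: copy the backup into an isolated directory
    (never straight into production) and verify its integrity against the
    source checksum before anything else happens."""
    restored = os.path.join(restore_dir, os.path.basename(backup))
    shutil.copy2(backup, restored)
    return sha256(restored) == sha256(source)
```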
Art. 13 Learning and Evolving [Important]

Requirements

  • Gather information from all ICT-related incidents and cyber threats
  • Conduct post-incident reviews after major ICT disruptions
  • Identify root causes and establish improvements
  • Feed lessons into the ICT risk management framework updates
  • Monitor effectiveness of the digital operational resilience strategy implementation
  • Mandatory ICT security awareness and digital operational resilience training for all staff
  • Monitor developments in ICT risk, technological developments, and threats
Security Engineer Takeaway: Post-incident reviews (PIRs) are mandatory, not optional. Build a PIR template that feeds findings into: (1) risk register updates, (2) detection rule improvements, (3) playbook updates, and (4) training programme adjustments. DORA also mandates security awareness training for all staff, including the management body.
Art. 14 Communication [Important]

Requirements

  • Communication policies for internal staff and external stakeholders
  • At least one designated person for handling communications during incidents
  • Define communication plans for both ongoing incidents and post-resolution
  • Must have policies on disclosure to clients (especially when incidents affect their services or data)
Art. 15 Further Harmonisation of ICT Risk Management Tools, Methods, Processes, and Policies [Standard]

Empowers the ESAs to develop Regulatory Technical Standards (RTS) further specifying the ICT risk management requirements. These RTS provide the granular technical detail beneath the principles in Articles 5–14.

Art. 16 Simplified ICT Risk Management Framework [Important]

Provides a lighter-touch framework for smaller entities (small and non-interconnected investment firms, payment institutions exempted under PSD2, etc.).

Simplified Requirements

  • Still need a sound and documented ICT risk management framework
  • Simplified documentation and formal requirements
  • Still need to monitor and review the framework annually
  • Minimise impact of ICT risk through the use of sound, resilient, and updated ICT systems
  • Identify critical functions and key dependencies

Chapter III — ICT-related Incident Management, Classification, and Reporting

Art. 17 ICT-related Incident Management Process [Critical]

Requirements

  • Define, establish, and implement an ICT-related incident management process
  • Put in place early warning indicators
  • Establish procedures to identify, track, log, categorise, and classify incidents
  • Assign roles and responsibilities for different incident types and scenarios
  • Plans for communication to staff, external stakeholders, media, and clients
  • Report major incidents to senior management; inform management body of impact and response
  • Establish incident response procedures to mitigate impact and ensure timely restoration
Security Engineer Takeaway: Your SIRP (Security Incident Response Plan) needs to integrate DORA requirements explicitly. Add a "DORA classification" step to your triage workflow. Every incident ticket should capture the classification criteria (clients affected, duration, data loss, etc.) so you can quickly determine if it crosses the "major incident" threshold.
Art. 18 Classification of ICT-related Incidents and Cyber Threats [Critical]

Classification Criteria for Incidents

  • Number of clients/counterparts affected
  • Reputational impact
  • Duration and service downtime
  • Geographic spread across Member States
  • Data losses (availability, authenticity, integrity, confidentiality)
  • Criticality of services affected
  • Economic impact (direct and indirect costs)

Classification Criteria for Cyber Threats

  • Criticality of services at risk
  • Number of clients/counterparts potentially affected
  • Geographic spread of areas at risk
Security Engineer Takeaway: Build these criteria into your SIEM correlation rules. If an alert fires on a system supporting 10,000+ clients, it automatically gets flagged for DORA major incident assessment. Create a scoring matrix that maps these criteria to your severity levels.
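One way to sketch that scoring matrix. The criteria keys, weights, and severity bands below are invented for illustration; the real thresholds belong to your classification policy and the applicable RTS.

```python
# Illustrative scoring matrix: criteria keys, weights, and bands are placeholders.
WEIGHTS = {
    "many_clients_affected": 3,
    "critical_service": 5,
    "cross_border": 2,
    "data_compromised": 4,
}

def dora_score(signals: dict) -> int:
    """Sum the weight of every criterion the alert enrichment flags as true."""
    return sum(w for k, w in WEIGHTS.items() if signals.get(k))

def severity(signals: dict) -> str:
    """Map the score to a SOC severity; anything scoring 8+ is queued for a
    DORA major-incident assessment."""
    s = dora_score(signals)
    if s >= 8:
        return "sev1-dora-assessment"
    if s >= 4:
        return "sev2"
    return "sev3"
```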
Art. 19 Reporting of Major ICT-related Incidents and Voluntary Reporting of Significant Cyber Threats [Critical]

Mandatory Reporting

  • Initial notification: Submit to competent authority after classifying incident as major (within 4 hours, maximum 24 hours)
  • Intermediate report: Within 72 hours — updated impact, root cause status, mitigation actions
  • Final report: Within 1 month — full root cause, total impact, remediation measures, lessons learned

Voluntary Reporting

  • Entities may voluntarily notify significant cyber threats they consider relevant (even if no incident occurred)

Clients

  • When a major incident has an impact on clients' financial interests, the entity must inform them without undue delay
  • Must inform clients of measures taken to mitigate the adverse effects
Security Engineer Takeaway: Pre-build your report templates. Have three templates (initial, intermediate, final) ready with all required fields. Automate data population where possible (from SIEM, ticketing). The 4-hour initial notification window is tight — your on-call process needs to include "DORA notification assessment" in the first 30 minutes of triage. Consider a SOAR playbook for this.
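The deadline arithmetic is worth automating so on-call never computes it by hand. A sketch using the timelines as described above; "one month" is approximated as 30 days here, and the intermediate and final deadlines are counted from classification for simplicity (check the applicable RTS for the exact reference points).

```python
from datetime import datetime, timedelta, timezone

def dora_report_deadlines(classified_at: datetime, aware_at: datetime) -> dict:
    """Deadlines per the timelines above: initial notification within 4 hours
    of classification (and never later than 24 hours from awareness),
    intermediate report at +72 hours, final report at one month
    (approximated as 30 days)."""
    return {
        "initial": min(classified_at + timedelta(hours=4),
                       aware_at + timedelta(hours=24)),
        "intermediate": classified_at + timedelta(hours=72),
        "final": classified_at + timedelta(days=30),
    }
```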
Art. 20 Harmonisation of Reporting Content and Templates [Important]

ESAs develop RTS specifying the content, timelines, and templates for incident reports. Standardised templates mean consistent reporting across the EU.

Art. 21 Centralisation of Reporting [Standard]

ESAs and ECB to explore the feasibility of establishing a single EU Hub for major ICT-related incident reporting.

Art. 22 Supervisory Feedback [Important]

Competent authorities must provide feedback and guidance to the reporting entity following their report. They may also share anonymised information about the incident with other entities to improve sector-wide resilience.

Art. 23 Operational or Security Payment-related Incidents [Important]

Special provisions for credit institutions, payment institutions, and e-money institutions regarding payment-related incidents. DORA's incident reporting replaces the PSD2 incident reporting for these entities.

Chapter IV — Digital Operational Resilience Testing

Art. 24 General Requirements for Digital Operational Resilience Testing [Critical]

Requirements

  • Establish, maintain, and review a sound and comprehensive digital operational resilience testing programme
  • Testing programme must be proportionate to the entity's size and risk profile
  • Include a range of assessments, tests, methodologies, practices, and tools
  • Follow a risk-based approach prioritising critical and important functions
  • Testing must be undertaken by independent parties (internal or external)
  • Establish procedures to prioritise, classify, and remedy findings
  • Ensure all identified issues are remediated or risk-accepted with appropriate approval
Security Engineer Takeaway: "Independent" means your own red team or an external firm — not the team that built the system. All findings need to be tracked to closure. Build this into your vulnerability management workflow: DORA finding → JIRA ticket → remediation → retest → closure documentation.
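The closure rule can be encoded as a simple validation over the findings backlog. Field names and statuses below are hypothetical, not a real tracker schema:

```python
# Hypothetical finding records from the testing programme.
findings = [
    {"id": "F-1", "severity": "high", "status": "closed", "retested": True},
    {"id": "F-2", "severity": "low",  "status": "risk-accepted", "approver": "CISO"},
    {"id": "F-3", "severity": "high", "status": "open"},
]

def unresolved(findings):
    """A finding satisfies the closure rule only if it is closed after a
    retest, or formally risk-accepted with a named approver."""
    bad = []
    for f in findings:
        ok = (f["status"] == "closed" and f.get("retested")) or \
             (f["status"] == "risk-accepted" and f.get("approver"))
        if not ok:
            bad.append(f["id"])
    return bad
```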
Art. 25 Testing of ICT Tools and Systems [Critical]

Required Tests

  • Vulnerability assessments and scans
  • Open-source analyses
  • Network security assessments
  • Gap analyses
  • Physical security reviews
  • Questionnaires and scanning software solutions
  • Source code reviews (where feasible)
  • Scenario-based tests
  • Compatibility testing
  • Performance testing
  • End-to-end testing
  • Penetration testing
Security Engineer Takeaway: This is a comprehensive test catalogue. You probably do many of these already, but document them all as part of your "DORA testing programme." The key addition many teams miss: open-source analysis (SCA tools for dependency vulnerabilities) and physical security reviews. Make sure you have evidence of each test type being performed at least annually.
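A trivial coverage check makes the annual evidence review mechanical. The catalogue set below is abbreviated and the labels are my own shorthand, not DORA's wording:

```python
# Abbreviated Art. 25 test catalogue versus evidence gathered this cycle.
REQUIRED_TESTS = {
    "vulnerability assessment", "open-source analysis", "network security assessment",
    "scenario-based test", "penetration test", "physical security review",
}

def missing_evidence(performed: set[str]) -> list[str]:
    """Return catalogue entries with no evidence this cycle -- the gap list
    for the annual testing-programme review."""
    return sorted(REQUIRED_TESTS - performed)
```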
Art. 26 Advanced Testing — Threat-Led Penetration Testing (TLPT) Critical

Requirements (for designated entities)

  • Carry out TLPT at least every 3 years
  • Competent authority identifies which entities must perform TLPT based on systemic impact, criticality, and ICT maturity
  • Must cover several or all critical or important functions
  • Must be performed on live production systems
  • Scope must include ICT services provided by third parties (with their involvement)
  • Entity must perform a risk management assessment and obtain approvals before testing
  • Testing scope, methodology, and results must be validated by the competent authority
  • Results and remediation plans must be attested by the competent authority
Security Engineer Takeaway: TLPT = TIBER-EU style red team assessment. This is a major undertaking: real threat intel drives the scenarios, testing happens in production, and the regulator validates results. If you're in scope, start planning 12+ months ahead. You'll need: threat intel provider, qualified red team (external), strong blue team documentation, and a robust risk management process for the testing itself (it's production, after all).
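The "plan 12+ months ahead" advice translates naturally into backward planning from the test start date. The lead times below are illustrative assumptions, not figures from DORA or the TLPT RTS:

```python
from datetime import date, timedelta

# Rough backward-planning helper for a TLPT engagement. The lead times
# are assumptions for illustration only.
LEAD_TIMES = {
    "contract threat intel provider": 365,
    "contract external red team": 300,
    "risk assessment & internal approvals": 180,
    "scope validation with competent authority": 120,
    "blue team evidence collection ready": 60,
}

def plan(test_start: date) -> list:
    """Milestones as (start-by date, task), earliest first."""
    return sorted(
        (test_start - timedelta(days=d), task)
        for task, d in LEAD_TIMES.items()
    )

for when, task in plan(date(2026, 9, 1)):
    print(when.isoformat(), task)
```

Adjust the task list and lead times to your own procurement and approval cycles; the shape of the exercise (work backwards from the production test window) is what matters.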
Art. 27 Requirements for TLPT Testers Critical

External Tester Requirements

  • Highest suitability and reputability
  • Technical and organisational capabilities for TLPT (specifically threat intelligence and penetration testing)
  • Certified by an accreditation body or adhere to formal codes of conduct/ethical frameworks
  • Provide independent assurance/audit reports on sound management of risks associated with TLPT
  • Covered by professional indemnity insurance

Internal Testers

  • May use internal testers but must also engage external testers for every third test
  • Competent authority may restrict internal testing based on specific conditions

Chapter V — Managing ICT Third-Party Risk

Art. 28 General Principles for Third-Party Risk Management Critical

Requirements

  • Financial entities remain fully responsible for compliance even when using ICT third-party providers
  • Must adopt and regularly review a strategy on ICT third-party risk
  • Maintain and update a register of information on all contractual arrangements with ICT providers
  • Distinguish between arrangements supporting critical/important functions and those that don't
  • Report the register to competent authorities at least annually
  • Inform competent authorities in a timely manner about planned new arrangements for critical/important functions
  • Before entering into contracts: identify and assess all relevant risks (including ICT concentration risk)
  • Conduct due diligence on prospective ICT providers
  • Only contract with providers that comply with appropriate information security standards
Security Engineer Takeaway: The register of ICT providers is a major deliverable. It must be comprehensive: every cloud service, SaaS tool, managed service, data provider. Flag which ones support critical/important functions. This register gets submitted to your regulator annually, so treat it like a living document. Consider a dedicated TPRM (Third-Party Risk Management) tool.
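As a sketch of what "treat the register like a living document" means in practice, here is a minimal data model. The field names are illustrative; the authoritative schema is the ITS on the register of information, which defines the exact data points per arrangement.

```python
from dataclasses import dataclass, asdict
import json

# Minimal sketch of one register entry. Field names are assumptions,
# not the official ITS data model.
@dataclass
class ProviderEntry:
    provider: str
    service: str
    supports_critical_function: bool
    data_locations: list
    contract_ref: str

register = [
    ProviderEntry("ExampleCloud", "IaaS hosting core banking",
                  True, ["EU-West", "EU-Central"], "CTR-2024-017"),
    ProviderEntry("MailerCo", "Marketing email", False, ["EU"], "CTR-2023-104"),
]

# Entries supporting critical/important functions get regulator
# attention first, so surface them separately.
critical = [e for e in register if e.supports_critical_function]
print(json.dumps([asdict(e) for e in critical], indent=2))
```

Keeping the register as structured data (rather than a spreadsheet full of free text) makes the annual regulatory submission and the critical/important split trivial to produce.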
Art. 29 Preliminary Assessment of ICT Concentration Risk and Sub-outsourcing Important

Requirements

  • Assess whether entering into a contract would lead to ICT concentration risk
  • Weigh benefits and costs of alternative solutions
  • Assess whether sub-outsourcing conditions are complied with (chaining of providers)
  • Consider risks arising from the entity and the provider being in different jurisdictions
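A first-pass concentration check is just counting how many critical functions lean on each provider. A minimal sketch with invented data:

```python
from collections import Counter

# Illustrative mapping of critical/important functions to the ICT
# provider supporting each one; the data is made up.
function_providers = {
    "payments processing": "ExampleCloud",
    "core banking": "ExampleCloud",
    "customer portal": "ExampleCloud",
    "fraud detection": "OtherVendor",
}

counts = Counter(function_providers.values())
# Providers supporting more than one critical function are candidates
# for the Art. 29 concentration risk assessment.
concentrated = {p: n for p, n in counts.items() if n > 1}
print(concentrated)
```

This obviously doesn't replace the qualitative assessment Art. 29 requires, but it is a cheap way to generate the shortlist of providers to assess.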
Art. 30 Key Contractual Provisions Critical

Mandatory Clauses for ALL ICT Contracts

  • Clear and complete description of all functions and services
  • Locations where data will be processed and stored (including subcontractors)
  • Data protection provisions including data access, recovery, and return/destruction on termination
  • Service level descriptions with precise quantitative and qualitative performance targets
  • Provider must assist during ICT incidents related to the service at no additional cost
  • Provider must participate in the entity's ICT security awareness programme
  • Obligation for the provider to implement and test BCP measures
  • Right to monitor the provider's performance on an ongoing basis (audit and access rights)
  • Termination rights with adequate notice periods
  • Cooperation of the provider with competent authorities

Additional Clauses for Critical/Important Functions

  • Full SLA targets covering availability, reliability, and response times
  • Notice periods and reporting obligations when provider developments may affect the service
  • Comprehensive exit strategies and transition plans
  • Provider must participate in the entity's TLPT programme
  • Provider must grant unrestricted rights of inspection and audit
  • Agreed-upon termination rights and minimum notice periods
Security Engineer Takeaway: This article will keep your legal team busy. Create a contract review checklist mapped to Art. 30 requirements. Every renewal is an opportunity to add missing clauses. Key items: audit rights (you need to be able to audit or inspect your cloud provider), data location transparency, and exit strategy. For cloud providers, check their shared responsibility models against these requirements.
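The contract review checklist mentioned in the takeaway can be expressed as a simple set difference. The clause labels below are a paraphrased, non-exhaustive subset of the Art. 30 items, chosen for illustration:

```python
# Contract-review sketch: compare clauses found in a contract against a
# paraphrased subset of Art. 30 mandatory items (not the full list).
REQUIRED_CLAUSES = {
    "service description",
    "data locations",
    "data return on termination",
    "sla targets",
    "incident assistance",
    "audit rights",
    "termination rights",
}

def gap_report(contract_clauses: set) -> set:
    """Clauses still missing from the contract."""
    return REQUIRED_CLAUSES - contract_clauses

found = {"service description", "sla targets", "termination rights"}
missing = gap_report(found)
print(sorted(missing))
```

Running this per contract at each renewal gives legal a concrete negotiation list instead of a general "make it DORA-compliant" request.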
Art. 31 Designation of Critical ICT Third-Party Service Providers Important

ESAs designate ICT third-party providers as "critical" based on criteria including:

  • Systemic impact on the financial sector if the provider fails
  • Number and type of financial entities relying on the provider
  • Degree of substitutability (how easy to replace)
  • Number of Member States where the provider's services are used
  • Degree of dependence of the financial entities on the provider's services
Art. 32–44 Oversight Framework for Critical Third-Party Providers Standard

Articles 32–44 establish the EU-level oversight framework for critical ICT third-party providers. Key elements:

Oversight Structure

  • Lead Overseer (Art. 33): Appointed from EBA, ESMA, or EIOPA based on which financial sector uses the provider most
  • Powers (Art. 35): Request information, conduct on-site/off-site investigations, issue recommendations
  • General investigations (Art. 38): Launch investigations into the provider's operations
  • Inspections (Art. 39): Conduct on-site inspections of the provider
  • Follow-up (Art. 42): If recommendations are not followed, competent authorities may require financial entities to suspend or terminate the service
  • Costs (Art. 43): Oversight costs charged to the critical providers
  • Non-EU providers (Art. 31): Must establish a subsidiary in the EU to fall under oversight
Security Engineer Takeaway: You won't manage this framework directly, but you need to know: if your critical cloud provider gets a recommendation from the Lead Overseer and doesn't comply, your regulator could ask you to suspend or terminate the service. This is why exit strategies (Art. 30) matter — you need a viable plan B for every critical provider.

Chapter VI — Information-Sharing Arrangements

Art. 45 Information-Sharing Arrangements on Cyber Threat Information and Intelligence Important

Provisions

  • Financial entities may exchange cyber threat information and intelligence amongst themselves
  • Including indicators of compromise (IoCs), TTPs, security alerts, and configuration tools
  • Sharing must protect personal data (GDPR), business confidentiality, and competition law
  • Must be carried out within trusted communities
  • Sharing arrangements must define conditions for participation and handling of shared information
  • Entities must notify competent authorities of their participation in sharing arrangements
Security Engineer Takeaway: Join FS-ISAC or your national financial CERT's sharing programme if you haven't already. Set up STIX/TAXII-compatible ingestion in your threat intel platform. When sharing, use TLP (Traffic Light Protocol) markings. DORA gives you the legal foundation to tell management "the regulation encourages this" when advocating for threat intel sharing programme budget.
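As a sketch of TLP-marked sharing, here is a minimal outbound record that validates the marking before anything leaves your platform. The structure is an assumption for illustration; real exchanges would typically use STIX objects over TAXII rather than this ad-hoc shape.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# TLP 2.0 marking levels (per FIRST's TLP standard).
TLP_LEVELS = ("TLP:RED", "TLP:AMBER+STRICT", "TLP:AMBER", "TLP:GREEN", "TLP:CLEAR")

@dataclass
class SharedIndicator:
    value: str        # e.g. an IP, domain, or file hash
    ioc_type: str
    tlp: str
    shared_at: str

def share(value: str, ioc_type: str, tlp: str) -> SharedIndicator:
    # Refuse to share anything without a recognised TLP marking.
    if tlp not in TLP_LEVELS:
        raise ValueError(f"unknown TLP marking: {tlp}")
    return SharedIndicator(value, ioc_type, tlp,
                           datetime.now(timezone.utc).isoformat())

ioc = share("198.51.100.7", "ipv4", "TLP:AMBER")
print(ioc.tlp, ioc.value)
```

Enforcing the marking at the sharing boundary, rather than trusting analysts to remember, is the design choice worth copying into whatever platform you actually use.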

Chapter VII–IX — Competent Authorities, Delegated Acts, and Final Provisions

Art. 46–56 Competent Authorities and Cooperation Standard

Establishes the supervisory framework: which national competent authorities enforce DORA, how they cooperate across borders, and their powers for administrative penalties and remedial measures.

Key Points

  • Each Member State designates competent authorities for DORA supervision
  • Competent authorities have supervisory, investigatory, and sanctioning powers
  • Cooperation mechanisms between national authorities and ESAs
  • Exchange of information between supervisors
  • Administrative penalties and remedial measures defined by Member States (including criminal penalties where applicable)
Art. 57–64 Delegated Acts, Amendments, Review and Final Provisions Standard

Final provisions covering delegated acts (empowering the Commission to specify technical details), amendments to existing regulations, the review clause (3-year review), and the entry into force / application date.

Key Dates

  • DORA entered into force on 16 January 2023
  • Applies from 17 January 2025
  • Review by 17 January 2028 (Commission report on appropriateness of enhanced requirements for audit and IT services)

Obligation Matrix

Key obligations grouped by who is responsible and what they need to do. Mapped to specific articles.

Management Body Obligations

Define & Approve ICT Risk Framework

Art. 5(2) • Management Body
  • Approve and oversee the ICT risk management framework
  • Review and update at least annually
  • Bear ultimate responsibility for compliance

Set Digital Resilience Strategy

Art. 6(8) • Management Body
  • Define the digital operational resilience strategy
  • Include methods to address ICT risk and objectives
  • Explain how the framework supports business strategy

Allocate ICT Budget

Art. 5(2)(b) • Management Body
  • Allocate and review adequate budget for digital operational resilience needs
  • Including ICT security awareness programmes and training

Maintain ICT Knowledge

Art. 5(4) • Board Members
  • Actively keep up to date with sufficient knowledge on ICT risk
  • Undertake regular specific training on ICT risks and their impact

Security / ICT Team Obligations

Asset Inventory & Mapping

Art. 8 • ICT / Security Team
  • Maintain complete ICT asset inventory (HW, SW, network, cloud)
  • Map all dependencies between systems, processes, and providers
  • Update after every major change

Implement Security Controls

Art. 9 • ICT / Security Team
  • IAM, MFA, encryption (at rest + transit)
  • Patch management, change management
  • Network security, SDLC, physical security

Run Detection Programme

Art. 10 • SOC / Security Team
  • Multi-layer detection (network, endpoint, application)
  • Define alert thresholds and trigger criteria
  • Maintain adequate monitoring resources

Incident Response & Reporting

Art. 17–19 • SOC / IR Team
  • Classify incidents against DORA criteria
  • Report major incidents: initial (4h), intermediate (72h), final (1mo)
  • Maintain report templates and escalation procedures
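The reporting deadlines above are mechanical enough to compute at classification time. A minimal sketch (approximating "one month" as 30 days, which is an assumption):

```python
from datetime import datetime, timedelta

# Deadline sketch based on the Art. 19 timelines summarised above:
# initial within 4 hours of classification, intermediate within 72
# hours, final within one month (approximated here as 30 days).
def reporting_deadlines(classified_at: datetime) -> dict:
    return {
        "initial": classified_at + timedelta(hours=4),
        "intermediate": classified_at + timedelta(hours=72),
        "final": classified_at + timedelta(days=30),
    }

d = reporting_deadlines(datetime(2025, 6, 1, 9, 0))
print(d["initial"])   # 2025-06-01 13:00:00
```

Wiring something like this into the incident classification step of a SOAR playbook means the clock starts (and escalates) automatically rather than depending on whoever is on shift.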

Execute Testing Programme

Art. 24–25 • Security Team
  • Annual vulnerability assessments and pen tests
  • Network assessments, OSS analysis, scenario testing
  • Track all findings to remediation

Manage TLPT (if designated)

Art. 26–27 • Security Team
  • Engage qualified external testers every 3 years
  • Test against live production systems
  • Include critical third-party providers in scope

BCP/DR Testing

Art. 11–12 • Security / IT Ops
  • Maintain and test BCP/DR plans annually
  • Document RTOs and RPOs for all critical functions
  • Test backup restoration regularly

Third-Party Risk Management

Art. 28–30 • Security / Procurement
  • Maintain register of all ICT provider contracts
  • Perform due diligence on providers
  • Review contracts against Art. 30 requirements
  • Assess ICT concentration risk

Reporting Obligations Summary

What | To Whom | When | Article
Major incident — initial notification | Competent authority | Within 4 hours of classification (no later than 24 hours after awareness) | Art. 19
Major incident — intermediate report | Competent authority | Within 72 hours | Art. 19
Major incident — final report | Competent authority | Within 1 month | Art. 19
Register of ICT providers | Competent authority | At least annually | Art. 28(3)
New critical provider arrangements | Competent authority | Timely (before contracting) | Art. 28(3)
Information sharing participation | Competent authority | Upon entering arrangement | Art. 45
TLPT results | Competent authority | After TLPT completion | Art. 26
Client notification (major incidents) | Affected clients | Without undue delay | Art. 19(3)

Key Dates & Timeline

Important milestones for DORA implementation.

24 September 2020
Commission Proposal
European Commission publishes the Digital Finance Package, including the DORA proposal.
10 May 2022
Political Agreement
Council and European Parliament reach provisional political agreement on DORA.
14 December 2022
Formal Adoption
DORA officially adopted as Regulation (EU) 2022/2554.
16 January 2023
Entry into Force
DORA enters into force. 24-month implementation period begins.
17 January 2024
First Batch of RTS/ITS Published
ESAs publish the first set of final draft Regulatory/Implementing Technical Standards and submit them to the Commission for adoption.
17 July 2024
Second Batch of RTS/ITS Published
ESAs publish second set of technical standards, including TLPT and ICT risk management details.
17 January 2025
Application Date (Enforcement Begins)
DORA becomes fully applicable. Financial entities must comply. Supervisory enforcement begins.
17 January 2026
Designation of Critical Third-Party Providers
ESAs expected to designate the first set of critical ICT third-party providers under the oversight framework.
17 January 2028
Commission Review
European Commission to review DORA and report on the appropriateness and need for enhanced requirements.

Compliance Checklist for Security Engineers

Practical checklist to assess your organisation's DORA readiness.

Pillar 1: ICT Risk Management

  • Documented ICT risk management framework exists and is approved by the management body
  • Digital operational resilience strategy documented and reviewed annually
  • Complete ICT asset inventory maintained (hardware, software, cloud, network)
  • Dependency mapping between ICT systems, processes, and third-party providers completed
  • Risk assessments performed regularly and after major changes
  • IAM policies implemented: least privilege, MFA, segregation of duties
  • Encryption policies for data at rest and in transit documented and implemented
  • Patch management programme in place with defined SLAs
  • Secure SDLC defined and enforced
  • Multi-layer detection deployed (SIEM + EDR + NDR)
  • BCP and DR plans documented for all critical functions
  • RTOs and RPOs defined and documented for all critical functions
  • Backup procedures tested (restore tested, not just backup completion)
  • Backups physically and logically separated from production
  • Post-incident review process established and feeding back into risk framework
  • ICT security awareness training programme for all staff (including board)
  • Crisis communication plan with designated spokesperson

Pillar 2: Incident Management & Reporting

  • ICT incident management process documented
  • Incident classification criteria aligned with DORA Art. 18 thresholds
  • Escalation procedures defined for major incidents
  • Initial notification template ready (4-hour deadline)
  • Intermediate report template ready (72-hour deadline)
  • Final report template ready (1-month deadline)
  • SOAR playbook includes DORA reporting workflow
  • Client notification process for incidents affecting their services
  • Significant cyber threat voluntary reporting process considered

Pillar 3: Resilience Testing

  • Documented digital operational resilience testing programme
  • Annual vulnerability assessments of all critical systems
  • Annual penetration testing
  • Open-source / SCA analysis integrated into CI/CD
  • Network security assessments performed annually
  • Scenario-based testing (tabletop + operational exercises) conducted
  • Finding remediation tracked to closure
  • TLPT assessment: confirmed whether entity is in scope
  • If TLPT in scope: qualified external tester identified/contracted
  • If TLPT in scope: threat intelligence provider identified for scenario development

Pillar 4: Third-Party Risk Management

  • Complete register of all ICT third-party provider contracts maintained
  • Critical/important functions mapped to supporting ICT providers
  • ICT concentration risk assessed and documented
  • Due diligence process defined for new ICT providers
  • Existing contracts reviewed against Art. 30 mandatory clauses
  • Audit rights confirmed in all critical provider contracts
  • Exit strategies documented for all critical provider arrangements
  • Data location transparency confirmed across all providers
  • Provider participation in testing programme contractually agreed
  • Annual provider register submitted to competent authority

Pillar 5: Information Sharing

  • Evaluated participation in financial sector ISAC or sharing community
  • Threat intelligence sharing mechanisms in place (STIX/TAXII capable)
  • Sharing agreements include GDPR and confidentiality provisions
  • Competent authority notified of sharing arrangement participation

Entities in Scope

DORA applies to 21 types of financial entities (Art. 2) and ICT third-party service providers under the oversight framework.

# | Entity Type | Examples | Simplified Framework?
1 | Credit institutions | Banks, building societies | No
2 | Payment institutions | Payment processors, PSPs | Some (PSD2-exempted)
3 | Account information service providers | Open banking aggregators | Yes
4 | Electronic money institutions | E-money issuers | Some
5 | Investment firms | Brokers, dealers, advisors | Small & non-interconnected
6 | Crypto-asset service providers | Exchanges, wallet providers | No
7 | Central securities depositories | CSDs | No
8 | Central counterparties | CCPs, clearing houses | No
9 | Trading venues | Stock exchanges, MTFs, OTFs | No
10 | Trade repositories | EMIR trade repositories | No
11 | Management companies (UCITS) | Fund managers | No
12 | Alternative investment fund managers | Hedge fund managers, PE managers | No
13 | Data reporting service providers | ARMs, APAs, CTPs | No
14 | Insurance undertakings | Insurance companies | No
15 | Reinsurance undertakings | Reinsurers | No
16 | Insurance intermediaries | Insurance brokers (certain) | Some
17 | Institutions for occupational retirement | Pension funds (IORPs) | No
18 | Credit rating agencies | S&P, Moody's, Fitch | No
19 | Administrators of critical benchmarks | LIBOR/EURIBOR administrators | No
20 | Crowdfunding service providers | EU crowdfunding platforms | No
21 | Securitisation repositories | Securitisation data repositories | No
Note on ICT Third-Party Service Providers: While not "financial entities" per se, ICT service providers supporting critical functions of financial entities are in scope via the contractual requirements (Art. 28–30) and, if designated as critical, via the direct oversight framework (Art. 31–44). This includes cloud providers (AWS, Azure, GCP), core banking SaaS, market data providers, and managed security services.

Regulatory Technical Standards (RTS) & Implementing Technical Standards (ITS)

The ESAs (EBA, ESMA, EIOPA) developed detailed technical standards that specify the requirements of DORA. These are the granular "how-to" beneath the regulation's principles.

First Batch (Published January 2024)

RTS on ICT Risk Management Framework

Art. 15 • JC 2023 86
  • Detailed elements of the ICT risk management framework
  • Simplified framework for qualifying entities
  • Specifies policies, procedures, and documentation requirements

RTS on Classification of Major Incidents

Art. 18(3) • JC 2023 83
  • Materiality thresholds for classifying major incidents
  • Quantitative and qualitative criteria
  • Classification methodology

ITS on Reporting Templates

Art. 20 • JC 2023 84
  • Standard templates for initial, intermediate, and final reports
  • Data fields and format requirements

RTS on ICT Third-Party Register of Information

Art. 28(9) • JC 2023 85
  • Templates and format for the register of ICT provider contracts
  • Data points to be captured per contractual arrangement

Second Batch (Published July 2024)

RTS on TLPT

Art. 26(11) • JC 2024 29
  • Requirements for threat-led penetration testing
  • Tester qualifications and certification
  • Scope, methodology, and reporting requirements

RTS on ICT Third-Party Policy

Art. 28(10)
  • Elements of the ICT third-party risk strategy
  • Due diligence and monitoring requirements
  • Exit strategy specifications

RTS on Subcontracting Critical Functions

Art. 30(5)
  • Conditions for sub-outsourcing of critical or important functions
  • Monitoring requirements for subcontracting chains

RTS on Oversight Harmonisation

Art. 41
  • Information to be provided by critical third-party providers to the Lead Overseer
  • Conduct of oversight activities

Guidelines on Oversight Cooperation

Art. 32(7)
  • Cooperation between ESAs and competent authorities for the oversight framework

RTS on Criteria for Critical ICT Provider Designation

Art. 31(6)
  • Quantitative and qualitative criteria for designating providers as critical
  • Systemic importance assessment methodology

Enforcement & Penalties

DORA does not specify a single EU-wide penalty amount (unlike GDPR). Instead, it empowers Member States and competent authorities to impose administrative penalties and remedial measures.

Enforcement Powers

Competent Authorities (Art. 50–51)

National financial regulators
  • Supervisory powers: request information, conduct investigations, on-site inspections
  • Require entities to take specific remedial measures
  • Impose administrative penalties and periodic penalty payments
  • Issue public statements identifying the entity and the breach
  • Order cessation of conduct that breaches DORA

Penalties for Critical Third-Party Providers (Art. 35)

Lead Overseer (ESA)
  • Lead Overseer can impose periodic penalty payments on critical ICT providers
  • Up to 1% of average daily worldwide turnover in the preceding business year, per day
  • Penalties for up to 6 months
  • If recommendations not followed: request financial entities to suspend/terminate

Member State Penalties (Art. 50)

Defined at national level
  • Member States define the specific penalty regimes in national law (DORA applies directly as a regulation, but sanctions are set nationally)
  • Must be effective, proportionate, and dissuasive
  • Some Member States may include criminal sanctions for natural persons
  • Personal liability for management body members possible
Security Engineer Takeaway: While DORA doesn't have a GDPR-style "4% of global turnover" headline number, the penalties are still significant. The bigger risk for most entities is the regulatory remedial measures: being ordered to suspend a critical ICT service, or having findings published publicly. For security teams, the key message to leadership is: non-compliance isn't just a fine risk, it's an operational risk (regulators can force changes to your infrastructure).

External Resources & References

Curated links to official documents, technical standards, guidance, and tools.

Official Legislation

Official

Regulation (EU) 2022/2554 — Full Text

The complete DORA regulation text on EUR-Lex. The authoritative legal source. Available in all EU languages.

Primary Source
Official

Directive (EU) 2022/2556 — Amending Directive

The accompanying directive that amends existing financial services directives to align them with DORA (CRD, MiFID II, Solvency II, PSD2, etc.).

Companion Legislation
Official

European Commission — DORA Implementing & Delegated Acts

Commission page listing all delegated and implementing acts under DORA, including adopted Commission Delegated Regulations.

Delegated Acts
Official

European Commission — DORA Q&A

Official Q&A document from the Commission clarifying scope, definitions, and application of DORA requirements.

Clarifications

ESA Technical Standards (RTS / ITS)

ESA

EBA — DORA Regulatory Products

EBA's central page for all DORA regulatory products including RTS, ITS, and guidelines. Start here for the banking-sector perspective.

EBA Hub All Pillars
ESA

ESMA — DORA Hub

ESMA's dedicated DORA page with technical standards, Q&As, and resources relevant to investment firms, trading venues, and market infrastructure.

ESMA Hub All Pillars
ESA

EIOPA — DORA Page

EIOPA's DORA resources for the insurance and pensions sectors.

EIOPA Hub Insurance
ESA

RTS on ICT Risk Management Framework (Art. 15)

The detailed technical standard specifying ICT risk management framework requirements and the simplified framework. Covers policies, procedures, governance.

Pillar 1 Art. 5-16
ESA

RTS on Incident Classification Criteria (Art. 18)

Materiality thresholds and criteria for classifying ICT-related incidents as "major." Essential for configuring your SIEM classification rules.

Pillar 2 Art. 17-23
ESA

ITS on Incident Reporting Templates (Art. 20)

Standard forms and templates for initial notifications, intermediate reports, and final reports. Download these and integrate them into your IR process.

Pillar 2 Templates
ESA

RTS on Register of Information for ICT Third-Party Providers (Art. 28)

Template and data model for the register of ICT contractual arrangements. Defines exactly what fields you need to capture for each provider.

Pillar 4 Templates
ESA

RTS on Threat-Led Penetration Testing — TLPT (Art. 26)

TLPT methodology, tester qualifications, scope requirements, and reporting. Maps closely to TIBER-EU. Essential reading if your entity is TLPT-designated.

Pillar 3 Art. 26-27
ESA

RTS on Subcontracting Critical Functions (Art. 30)

Conditions for sub-outsourcing ICT services supporting critical or important functions and monitoring requirements for subcontracting chains.

Pillar 4 Art. 30
ESA

RTS on Designation Criteria for Critical ICT Providers (Art. 31)

Quantitative and qualitative criteria the ESAs use to designate providers as "critical" under the oversight framework.

Pillar 4 Art. 31

Related Frameworks & Practical Guidance

Guidance

ECB — TIBER-EU Framework

The Threat Intelligence-Based Ethical Red Teaming framework that DORA's TLPT is modelled on. Understand TIBER-EU and you understand TLPT.

Pillar 3 Red Teaming
Guidance

ENISA — Incident Reporting Resources

ENISA's guidance on incident classification and reporting across EU frameworks. Useful for aligning DORA reporting with NIS2 and other regimes.

Pillar 2 Incidents
Guidance

ENISA — Threat Landscape Reports

Annual EU threat landscape analysis. Useful for informing risk assessments (Art. 8) and understanding threat context for TLPT scenarios.

Pillar 1 Threat Intel
Guidance

ISO/IEC 27001 — Information Security Management

If you're ISO 27001 certified, use it as a foundation. Many DORA Pillar 1 requirements map to Annex A controls. Gap analysis recommended.

Pillar 1 Framework
Guidance

NIST Cybersecurity Framework (CSF)

NIST CSF maps well to DORA's Identify-Protect-Detect-Respond-Recover structure. Useful for building a crosswalk between your existing controls and DORA.

Pillar 1 Framework
Guidance

EBA Guidelines on ICT and Security Risk Management

Pre-DORA EBA guidelines that formed the basis for many DORA requirements. Useful to understand the evolution and for institutions already compliant with these.

Pillar 1 Legacy
Guidance

EBA Guidelines on Outsourcing Arrangements

Pre-DORA outsourcing guidelines. Many organisations built their third-party risk programmes on these — DORA builds on and supersedes them for ICT services.

Pillar 4 Legacy
Guidance

NIS2 Directive (EU) 2022/2555 — Full Text

The broader EU cybersecurity directive. Financial entities are generally exempt from NIS2 because DORA is lex specialis, but understanding the overlap helps with group-level compliance.

Related NIS2

Tools & Practical Resources

Tool

OpenCRE — Common Requirements Enumeration

Map DORA requirements to other standards (ISO 27001, NIST, CIS Controls) using OpenCRE. Excellent for building crosswalk / gap analysis matrices.

Mapping All Pillars
Tool

STIX/TAXII — Threat Intelligence Sharing Standards

The standard protocols for sharing threat intelligence. You'll need STIX/TAXII capability to participate in information sharing arrangements under Art. 45.

Pillar 5 CTI
Tool

MISP — Malware Information Sharing Platform

Open-source threat intelligence platform widely used in the financial sector. Supports STIX/TAXII, IoC sharing, and community integration.

Pillar 5 Open Source
Tool

MITRE ATT&CK Framework

Map your detection capabilities (Art. 10) and TLPT scenarios (Art. 26) to the ATT&CK matrix. Essential for structured red/blue teaming.

Pillar 2-3 Detection
Tool

NIST Risk Management Framework (RMF)

Methodology for structuring your ICT risk assessments (Art. 8). Complements DORA's requirements with a process-oriented approach.

Pillar 1 Risk
Tool

TLP — Traffic Light Protocol

Standard marking system for information sharing. Use TLP when participating in threat intel sharing communities under DORA Art. 45.

Pillar 5 Sharing

Community & Industry Bodies

Community

FS-ISAC — Financial Services ISAC

The global financial sector ISAC. Membership gives you access to threat intel feeds, sharing communities, and exercises — directly supports Art. 45 compliance.

Pillar 5 Threat Intel
Community

ECB — Euro Cyber Resilience Board (ECRB)

High-level forum for cyber resilience in the Euro area financial infrastructure. Publishes recommendations and best practices.

All Pillars Policy
Community

ENISA — EU Agency for Cybersecurity

EU cybersecurity agency. Publishes threat landscapes, best practices, and coordinates incident response across the EU. Key resource for risk assessments.

All Pillars Agency
Community

SWIFT — Customer Security Programme (CSP)

If you use SWIFT, their CSP controls overlap significantly with DORA Pillar 1. A joint assessment can reduce duplication.

Pillar 1 Banking
Tip: Links may change over time as regulators restructure their websites. If a link is broken, search the ESA/authority name + the document title. The official legal text on EUR-Lex is the most stable reference. The ESA regulatory product pages are the best entry points for finding the latest versions of RTS/ITS standards.
