How DORA, NIS2, and SEC Rules Are Reshaping Third-Party Cyber Risk Management

Today, cybersecurity regulations are raising the bar for third-party risk management (TPRM). Chief Risk Officers (CROs), Chief Information Security Officers (CISOs), and vendor risk managers at multinational companies face new rules that demand stronger oversight of vendors and service providers. Three major regulatory initiatives effective in 2025 stand out: the European Union’s Digital Operational Resilience Act (DORA), the EU’s updated NIS2 Directive, and the U.S. Securities and Exchange Commission (SEC) Cybersecurity Disclosure Rule. Each of these measures, while differing in scope and jurisdiction, converges on a common theme: ensuring organizations manage cyber risks in their supply chains and vendor relationships with rigor and transparency.

These regulations are not limited to a single industry or region. DORA targets financial entities in the EU, NIS2 broadens cybersecurity requirements across numerous critical sectors in Europe, and the SEC rule affects publicly traded companies across all industries in the United States. Together, they signal a global shift: regulators expect companies to formalize cybersecurity in third-party contracts, conduct thorough vendor due diligence, and be prepared to report on third-party cyber risks. In practical terms, this means reviewing vendor contracts for compliance clauses, asking suppliers tough security questions, and embedding third-party risk considerations into every risk assessment.

For busy risk and compliance leaders, navigating these requirements can feel daunting. This operational guide breaks down each regulation, focusing on how they impact third-party contracts and risk assessments. We’ll provide clear summaries of DORA, NIS2, and the SEC rule, then delve into tangible impacts on TPRM programs including examples of contract clauses to include and due diligence questions to ask vendors. Finally, we offer actionable compliance checklists for each regulation, so you can translate regulatory text into concrete steps. The goal is to equip you with a practical roadmap to compliance that is professional but accessible, enabling you to implement improvements in your organization’s third-party risk practices immediately.

With that foundation, let’s explore each of the major 2025 regulations in turn and how they reshape the way we manage third-party cyber risks.

Major 2025 Regulations at a Glance

Before diving deeper, here’s a quick overview of the three key regulations and why they matter for third-party risk:

  • EU Digital Operational Resilience Act (DORA) – Effective January 17, 2025, DORA is an EU regulation aimed at financial sector firms (banks, insurance companies, investment firms, etc.)[3]. It establishes a comprehensive framework to ensure these entities can withstand and recover from ICT (Information and Communication Technology) disruptions. Critically, DORA includes strict requirements for managing risks stemming from third-party ICT service providers and even sets up an oversight regime for “critical” tech vendors[4][5].
  • EU NIS2 Directive – Formally Directive (EU) 2022/2555, NIS2 came into force in January 2023 and EU Member States were required to transpose it into national law by October 17, 2024[6]. NIS2 dramatically expands the scope of Europe’s cybersecurity rules to cover 18 critical sectors (beyond the original NIS Directive’s sectors) and raises the level of cybersecurity across the EU[7][8]. It mandates that even medium and large organizations in sectors like energy, healthcare, transport, digital services, manufacturing, and more implement cybersecurity risk management measures and report significant incidents. Notably, NIS2 places strong emphasis on supply chain security, requiring companies to address the cyber risks in their supplier relationships[9][10].
  • U.S. SEC Cybersecurity Disclosure Rule – Adopted by the U.S. SEC in 2023, this new rule took effect for large public companies in late 2023 (with staged compliance into 2024) and requires publicly traded companies to disclose material cybersecurity incidents within 4 business days and to annually report on their cybersecurity risk management, strategy, and governance[11][12]. Importantly, the SEC explicitly asks companies to describe “processes to oversee and identify cybersecurity risks from third-party service providers” as part of their disclosures[13]. This rule pushes cybersecurity, including vendor-related cyber risks, squarely into board reports and investor disclosures, effectively obligating companies to have robust TPRM practices and incident response plans that extend to suppliers.

In the sections below, we’ll unpack each of these regulations in more detail. We’ll highlight the core requirements and then zero in on what CROs, CISOs, and vendor risk managers need to do about their third-party relationships and contracts in response.

EU Digital Operational Resilience Act (DORA)

What is DORA?
 DORA is a landmark EU regulation that harmonizes and strengthens operational resilience requirements for the financial sector. It came into application on January 17, 2025 and applies across all EU member states without the need for national implementation laws (it’s directly applicable)[14]. DORA was introduced because financial institutions are increasingly dependent on technology and third-party tech providers, making them vulnerable to cyber incidents and outages[15]. Prior to DORA, banks and others followed various guidelines (like the European Banking Authority’s outsourcing and ICT risk guidelines), but DORA creates a single, binding framework that raises the bar on digital resilience[16].

Scope: DORA covers ~20 types of financial entities, including banks, insurance companies, payment firms, investment firms, and credit unions: essentially any EU-regulated financial sector entity[17]. It also directly affects ICT third-party service providers that serve those financial entities. In fact, DORA has a mechanism to designate certain tech providers as “critical” ICT third-party providers, which then subjects them to direct oversight by European regulators[18]. Cloud service providers, core banking software companies, payment processors, or other tech firms could fall in this category if they are integral to many financial institutions.

Key Requirements: DORA is built on five pillars of operational resilience[19]:

  • ICT Risk Management: Firms must implement an internal ICT risk management framework covering everything from risk identification to protection, detection, response, and recovery[20]. This must be integrated into overall risk management and updated regularly to address evolving cyber threats.
  • Incident Reporting: Financial entities have to monitor and report major ICT-related incidents to regulators within set timeframes[21]. DORA standardizes what constitutes a major incident and how quickly it must be reported to ensure regulators have an early warning of incidents that could impact the financial system.
  • Digital Operational Resilience Testing: Firms must conduct regular testing of their cyber defenses, including basic testing for all and advanced threat-led penetration testing for significant institutions[22]. These tests help identify vulnerabilities and ensure preparedness for cyberattacks or system failures.
  • ICT Third-Party Risk Management: Critically for our focus, DORA imposes rigorous requirements on how financial firms manage third-party ICT providers[5]. This includes conducting due diligence before onboarding a vendor, ongoing monitoring of the vendor’s performance and security measures, and ensuring the vendor contract includes specific provisions assigning responsibilities and enabling oversight.
  • Information Sharing: DORA encourages financial entities to share cyber threat information with each other to improve collective resilience[23] (for example, sharing intel on new malware targeting banking systems).

Among these pillars, the ICT Third-Party Risk Management component is particularly detailed. DORA essentially forces financial institutions to formally assess and control risks from their technology suppliers in a way that goes beyond prior practices. Let’s break down how DORA affects third-party relationships:

Impact on Third-Party Contracts: DORA Article 30 lists “Key contractual provisions” that must be included in contracts with ICT third-party service providers[24][25]. This is one of the most operationally significant parts of DORA for vendor management. Financial entities had to review and amend their supplier contracts by the January 2025 deadline to ensure these provisions are in place[26]. Some of the required contractual clauses include:

  • Clear Roles and Responsibilities: The contract must clearly define what services the vendor will provide and each party’s obligations for managing ICT risks[27]. No ambiguity should exist about who handles what in terms of security and resilience.
  • Service Descriptions & Subcontracting: A detailed description of all services/functions the vendor will provide, with an indication whether any critical functions can be subcontracted. If subcontracting is allowed, the contract must state conditions (e.g. requiring the financial institution’s approval)[28][29].
  • Locations of Data and Services: The regions or countries where the vendor will process data or provide services must be specified, and the vendor must agree to notify the financial institution before changing those locations[30]. This addresses data residency and geopolitical risk concerns.
  • Data Protection and Integrity: Provisions to ensure the availability, integrity, confidentiality, and authenticity of data handled by the provider[31]. Essentially, the contract should obligate the vendor to implement appropriate data security measures (including for personal data protection, aligning with GDPR where relevant).
  • Access to Data Upon Termination or Crisis: Clauses ensuring that if the vendor goes out of business or the contract is terminated, the financial institution can recover its data in a usable format[32]. This includes ensuring data can be accessed or returned during an exit, and not held hostage.
  • Service Level Agreements (SLAs): The contract should include service level metrics and performance targets, with the ability to update them as needed[33]. For critical services, SLAs must be very specific with quantitative targets, enabling effective monitoring[34][35].
  • Incident Assistance: The provider must assist the financial institution during ICT incidents at no extra cost (or at a pre-agreed cost)[36]. In other words, if there’s a cyber incident (say, a widespread malware attack) affecting the vendor’s services to the bank, the vendor can’t demand extra payment from the bank to help resolve it. This encourages cooperative incident response.
  • Regulatory Cooperation: A crucial DORA clause is that the vendor must fully cooperate with the financial institution’s regulators and resolution authorities[37]. This means if a regulator or a resolution authority (in case the bank is in distress) asks the vendor for information or to submit to an inspection, the vendor is contractually bound to comply. Essentially, the vendor agrees to be transparent to regulators as needed.
  • Termination Rights: The financial institution must have the right to terminate the contract, with reasonable notice, particularly if the vendor breaches legal or resilience requirements[38]. DORA expects firms to avoid vendor lock-in situations that could jeopardize resilience. Contracts should include defined notice periods for termination and perhaps also conditions under which termination can be triggered (e.g. persistent SLA breaches or supervisory concerns).
  • Participation in Security Exercises: Interestingly, DORA requires that critical providers participate in the financial institution’s cyber exercises. Contracts for critical or important functions should obligate the vendor to join in threat-led penetration tests (TLPT) or other resilience testing that the financial institution conducts[39]. For example, if the bank runs a cyber range exercise or a crisis simulation, the cloud provider hosting key systems should be part of it.
  • Audit and Access Rights: This is perhaps the most significant requirement: unrestricted audit rights. The contract must allow the financial institution and its regulators (and any third-party auditor it appoints) to access the vendor’s premises, systems, and data relevant to the service, and to conduct audits and inspections at will[40][41]. The vendor must cooperate fully and cannot impede these audits. If the vendor has other clients that limit such broad audits, the contract should at least allow alternative assurance (e.g. vendor-provided audit reports) as a fallback[42], but regulators must still be able to inspect as needed. This is a game-changer for cloud and SaaS providers that historically resisted extensive audit clauses.
  • Exit Strategy & Transition: Contracts must include provisions for an orderly exit: a “mandatory adequate transition period” after termination during which the vendor still provides services to facilitate migration[43][44]. For example, if a bank decides to switch providers, the old provider can’t just cut off services on day one; they have to continue for a period (say 6 months) to allow a smooth transfer of operations. The contract should also address portability, e.g. assisting with data transfer to a new provider.
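
Operationally, the re-papering exercise behind this clause list amounts to a gap analysis across every vendor contract. As a rough illustration (the clause labels, data model, and vendor names here are our own shorthand, not DORA's text), a vendor risk team might track Article 30 coverage like this:

```python
# Hypothetical sketch of tracking DORA Article 30 clause coverage during a
# contract re-papering review. Clause labels paraphrase the provisions
# discussed above; they are not the regulation's official wording.
from dataclasses import dataclass, field

ARTICLE_30_CLAUSES = [
    "roles_and_responsibilities", "service_description_and_subcontracting",
    "data_and_service_locations", "data_protection_and_integrity",
    "data_access_on_exit", "service_level_agreements", "incident_assistance",
    "regulatory_cooperation", "termination_rights",
    "resilience_testing_participation", "audit_and_access_rights",
    "exit_and_transition",
]

@dataclass
class Contract:
    vendor: str
    critical: bool                       # supports a critical/important function?
    clauses_present: set = field(default_factory=set)

    def gaps(self) -> list:
        """Return the Article 30 clauses still missing from this contract."""
        return [c for c in ARTICLE_30_CLAUSES if c not in self.clauses_present]

def review(contracts):
    """List contracts with gaps, critical-function contracts first."""
    findings = [(c.vendor, c.critical, c.gaps()) for c in contracts]
    return sorted((f for f in findings if f[2]), key=lambda f: not f[1])

contracts = [
    Contract("CloudCo", critical=True,
             clauses_present=set(ARTICLE_30_CLAUSES) - {"exit_and_transition"}),
    Contract("PrintShop", critical=False,
             clauses_present={"roles_and_responsibilities"}),
]
for vendor, critical, gaps in review(contracts):
    print(vendor, "CRITICAL" if critical else "non-critical", gaps)
```

A tracker like this, however simple, also produces the documentation of compliance efforts that regulators expect to see.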

In practice, financial institutions have been busy re-papering their contracts to include these elements. For CROs and vendor risk managers, a DORA contract compliance review became an essential project. Regulators (like BaFin in Germany) even published guidance enumerating these minimum contract terms and expect documentation of compliance efforts[45][46].

Impact on Risk Assessments and Vendor Diligence: Beyond contracts, DORA influences how firms approach vendor risk assessments:

  • Pre-contract Due Diligence: Under DORA, before signing on a new ICT service provider, a financial entity should perform a thorough risk assessment of that provider[47]. This means evaluating the provider’s security posture, operational resilience, and ability to meet DORA’s requirements. For example, due diligence questions might cover: Does the provider have robust cybersecurity certifications or audits (ISO 27001, SOC 2)? Have they experienced major incidents and how were those handled? What controls and redundancies do they have to ensure uptime? Essentially, firms need to assess if the vendor is resilient enough and trustworthy to handle critical operations. If not, risk mitigation or choosing a different vendor might be necessary.
  • Continuous Monitoring: DORA doesn’t allow a “set and forget” approach. After onboarding, institutions must continuously monitor their third-party providers’ performance and risk levels[48][49]. This could involve regular service reviews, requiring periodic security reports from the vendor, using tools to monitor the vendor’s cybersecurity (e.g. security ratings services, as some banks do), and staying alert to any incidents the vendor faces. If a cloud provider has a major outage or a breach affecting other clients, the bank should proactively assess impact on its own operations.
  • Concentration Risk Analysis: DORA explicitly calls out concentration risk: the risk of too many firms relying on the same provider[50]. Financial institutions need to analyze if they have an over-reliance on a single third-party for critical services and what systemic risk that poses. For example, if five major banks all outsource core banking to the same tech firm, that’s a concentration risk. DORA expects firms (and regulators) to consider such scenarios and potentially diversify or have contingency plans[51]. As a CRO, you might need to present to your board how you’re mitigating the risk of “all our eggs in one basket” with certain vendors.
  • Third-Party Risk Register: Firms must maintain an inventory (register) of all ICT third-party providers and the services they deliver, along with their criticality and risk assessments[52]. This centralized register should be up-to-date and available for regulators to review. It’s essentially a comprehensive mapping of your digital supply chain: who your vendors are, what they do for you, and what risks they pose.
  • Resilience Testing Involving Vendors: In line with the contractual obligation mentioned, banks should include key third-party services in their resilience testing. For instance, if you conduct a disaster recovery test, involve your cloud provider to verify data restoration works. DORA even mandates that critical ICT providers submit to regulator-run testing if they’re designated as critical. From the bank’s perspective, you should test backup links, alternative providers, or in-house backups to prepare for a scenario where a vendor fails. Some firms are doing simulated “vendor down” exercises to see how quickly they can transition to contingency arrangements.
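
The register and concentration-risk points above fit together naturally: once the register exists, a concentration check is a simple query over it. A minimal sketch (field names, entries, and the flagging threshold are illustrative assumptions, not prescribed by DORA):

```python
# Illustrative sketch: a minimal ICT third-party register with a simple
# concentration-risk check. All providers and services are invented.
from collections import Counter

register = [
    {"provider": "CloudCo", "service": "core banking hosting", "critical": True},
    {"provider": "CloudCo", "service": "payments gateway",     "critical": True},
    {"provider": "MailCo",  "service": "marketing email",      "critical": False},
    {"provider": "DataCo",  "service": "market data feed",     "critical": True},
]

def concentration_flags(entries, threshold=2):
    """Flag providers delivering `threshold` or more critical services."""
    counts = Counter(e["provider"] for e in entries if e["critical"])
    return {p: n for p, n in counts.items() if n >= threshold}

print(concentration_flags(register))  # {'CloudCo': 2}
```

In a real program the register would also capture criticality ratings, data locations, and last-assessment dates, so the same dataset can answer both regulator requests and board questions.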

Example – Contract Clauses and Due Diligence Questions under DORA:

  • Sample Contract Clause (Audit Right): “Vendor shall grant [Financial Institution] and its regulatory authorities unrestricted right of access and audit. This includes on-site inspections of Vendor’s relevant business locations and systems, and access to all documents and resources necessary to monitor compliance with this Agreement and applicable law[40]. Vendor agrees to cooperate fully with such audits and inspections, and to promptly address any findings.” – This clause aligns with DORA’s requirement for auditability, ensuring you can verify the vendor’s controls throughout the relationship.
  • Sample Contract Clause (Incident Notification): “Vendor will notify [Financial Institution] within 12 hours of detecting any cybersecurity incident or disruption, whether in Vendor’s network or its subcontractors’, that materially affects, or has the potential to affect, the services provided. Vendor shall provide regular updates and assist [Financial Institution] in incident response at no additional cost[36].” This would help the financial institution meet both DORA’s incident response needs and its own regulatory reporting obligations (since the clock starts when you know of an incident).
  • Due Diligence Question: “Do you have a certified Information Security Management System (e.g., ISO/IEC 27001)? Please provide the latest certification or audit report.” This question checks a vendor’s baseline security maturity. Under DORA, you’d favor vendors who follow international standards, as that indicates stronger controls.
  • Due Diligence Question: “Describe your backup and recovery capabilities. If your service to us was disrupted, how quickly could you restore operations? When was your last disaster recovery test, and will you share the results?” This addresses the resilience aspect. DORA expects firms to only use third parties that can demonstrate solid business continuity. A robust answer (with evidence) from the vendor would be reassuring.
  • Due Diligence Question: “Have you ever been subject to regulatory oversight or designation as a critical ICT provider? If yes, what were the outcomes of any recent regulatory inspections?” Since DORA allows major providers to be labeled “critical” and overseen by EU authorities, asking this can reveal if the vendor is already on regulators’ radar. Even if not, it signals to the vendor that you expect compliance with regulatory standards.

In summary, DORA makes third-party risk management a board-level issue for financial institutions. It enforces discipline in how contracts are written and how vendors are vetted and monitored[5]. For CROs and CISOs, DORA compliance means working closely with procurement, IT, and legal teams to overhaul vendor contracts and implementing continuous oversight mechanisms on critical suppliers. While initially resource-intensive, these efforts pay off by reducing the likelihood of third-party failures and by demonstrating to regulators (and clients) that your organization can handle shocks to its digital ecosystem.

EU NIS2 Directive

What is NIS2?
NIS2 is the European Union’s Network and Information Security Directive 2, an update and expansion of the original NIS Directive from 2016. It represents the EU’s broader strategy to raise cybersecurity resilience across a wide range of sectors deemed critical or important to society and the economy[7][8]. Unlike DORA (which is a regulation for finance), NIS2 is a directive, meaning it set objectives that each EU Member State had to implement through national laws by October 2024[6]. As of 2025, those national laws are coming into effect, so organizations across the EU are now facing new cybersecurity obligations as a result of NIS2.

Scope: NIS2 applies to organizations termed “essential entities” and “important entities” across 18 sectors. Essential sectors include traditional critical infrastructure: energy, transportation, banking, financial market infrastructure, health, drinking water, wastewater, digital infrastructure, public administration (central level), and space. Important sectors include areas like manufacturing of critical products (e.g., medical devices, electronics), food supply, postal and courier services, chemical manufacturing, waste management, digital providers (like social media platforms), and public administration at regional level[8]. In practice, medium-sized and large companies in these sectors are in scope[53] (typically defined as 50+ employees or turnover above certain thresholds), though Member States can also include smaller ones if critical.

If you’re a CISO or risk officer at, say, a European utility company, hospital group, railway operator, or a large software company providing services, NIS2 likely applies. Each country will have a list of entities under NIS2 oversight[54]. One key point: NIS2 repealed NIS1 as of 18 Oct 2024[6], meaning it supersedes prior law and brings many more organizations into scope.

Key Requirements: NIS2 sets out high-level obligations for risk management and incident reporting:

  • Cybersecurity Risk Management Measures: Article 21 of NIS2 describes a set of minimum measures that organizations must implement. These include having policies for risk analysis and information security; incident handling processes; business continuity and disaster recovery; supply chain security; secure development practices (vulnerability handling, use of encryption, etc.); regular assessment of the effectiveness of cybersecurity measures; basic cyber hygiene and training; access controls; and the use of multi-factor authentication where appropriate[55][56]. This reads like a comprehensive checklist of cybersecurity program components; essentially, NIS2 is telling companies: “Have a thorough cybersecurity program covering these domains.” Notably, clause (d) in that list explicitly calls out “supply chain security, including security-related aspects concerning the relationships between each entity and its direct suppliers or service providers.”[57]. This is a direct mandate that you cannot ignore third-party risk: your cybersecurity program must address how you vet and manage suppliers.
  • Incident Reporting: NIS2 introduces strict timelines for reporting significant incidents to authorities. Article 23 outlines a three-stage process[58][59]:
  • An initial early warning within 24 hours of becoming aware of a significant incident (this is basically a heads-up to regulators/CSIRTs that something is going on).
  • An incident notification report within 72 hours with a first analysis of the incident’s nature, impact, and mitigation steps.
  • A final report within one month, providing deeper details (impact assessment, root cause, measures taken, etc.). Additionally, if an incident is especially large or spills over to other countries, there are provisions for informing the public or other Member States[60]. A “significant” incident is generally one that causes substantial operational disruption or financial losses, or affects a large number of people or other sectors. It can specifically include incidents at third-party service providers: if, for example, a cloud provider outage knocks out your services, you may need to report that as a significant incident even though the cause was at the third party[61].
  • Top Management Accountability: NIS2 explicitly holds senior management accountable for compliance. It states that management bodies can be held liable for non-compliance and that cybersecurity must get board-level attention[62]. This cultural change ensures that executives can’t delegate away cybersecurity; they must actively oversee it, including supply chain risks. Boards of companies in scope should expect to receive regular updates on cybersecurity and TPRM, and could face penalties if their organization is found egregiously non-compliant after an incident.
  • Enforcement and Fines: While specifics depend on national laws, NIS2 sets a baseline for penalties (which can be quite significant, potentially ranging up to millions of euros or a percentage of turnover for the most severe violations). Regulators in each country have powers to audit, issue binding instructions, or sanction companies that don’t meet the requirements. This adds teeth to the obligations.
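
The staged reporting clock above translates directly into deadline arithmetic once you know when the entity became aware of the incident. A minimal sketch (treating NIS2's "one month" final report as 30 days, which is a simplification; national implementations may count differently):

```python
# Sketch of the NIS2 Article 23 reporting clock. Given the moment an entity
# becomes aware of a significant incident, compute the three deadlines.
from datetime import datetime, timedelta

def nis2_deadlines(aware_at: datetime) -> dict:
    """Return the early warning, incident notification, and final report
    deadlines counted from the moment of awareness."""
    return {
        "early_warning": aware_at + timedelta(hours=24),
        "incident_notification": aware_at + timedelta(hours=72),
        "final_report": aware_at + timedelta(days=30),  # "one month", approximated
    }

# Example: incident discovered at 09:00 on 1 March 2025
deadlines = nis2_deadlines(datetime(2025, 3, 1, 9, 0))
print(deadlines["early_warning"])          # 2025-03-02 09:00:00
print(deadlines["incident_notification"])  # 2025-03-04 09:00:00
```

The practical takeaway: the 24-hour window is short enough that vendor notification clauses and internal escalation paths must already be in place before an incident, not improvised during one.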

For third-party risk managers, NIS2’s most significant message is that supply chain cybersecurity is now mandatory, not optional[63]. Under NIS2, if you’re an in-scope entity, you must systematically manage the cybersecurity of your vendors, suppliers, and service providers. Let’s detail what that entails:

Impacts on Third Party Risk Management under NIS2:

  • Supply Chain Risk Management Program: Companies need a formal program to address supply chain cyber risks[64]. This means identifying your critical suppliers, assessing the cyber risks they pose, and integrating that into your overall risk assessments. If you previously had a patchy vendor review process, now it needs to be robust and documented. For example, a comprehensive supply chain risk management strategy might involve categorizing vendors by criticality (Tier 1, Tier 2, etc.), performing annual risk assessments for key vendors, and having specific security requirements for them.
  • Supplier Security Expectations: NIS2 (and likely your national implementing law) expects organizations to hold their suppliers accountable for cybersecurity[65]. In practice, this means you should incorporate cybersecurity requirements into contractual arrangements with suppliers. Contracts with vendors should include clauses requiring them to implement appropriate security measures, comply with certain standards, and even allow audits, similar in spirit to DORA’s clauses. A concrete example: you might add a clause that “Supplier shall maintain an information security program in line with industry best practices (ISO 27001 or equivalent) and will provide upon request evidence of compliance, such as audit reports or certificates. Supplier agrees to security audits by or on behalf of [Your Company] once per year.” Also, you’d include incident notification clauses (e.g. supplier must notify you within X hours of any breach). Bitsight’s analysis of NIS2 stresses setting clear expectations in contracts and conducting regular vendor audits to ensure compliance[66].
  • Vendor Due Diligence: Before onboarding new vendors, especially those that will have access to sensitive data or systems, in-scope companies should conduct thorough cybersecurity due diligence. This could involve sending detailed security questionnaires, reviewing the vendor’s policies, and checking independent evidence of their security posture. One recommended practice is to require vendors to have cybersecurity certifications or attestations; for instance, asking if they have ISO 27001 certification or SOC 2 Type II audit reports[67]. Some organizations might ask for Software Bills of Materials (SBOMs) for software providers, as managing vulnerabilities in third-party components is part of supply chain security (indeed, some EU countries are making SBOMs mandatory for certain products to meet NIS2 objectives[68]).
  • Continuous Monitoring of Suppliers: It’s not enough to vet a vendor once. NIS2 implies ongoing oversight. Companies are wise to implement continuous monitoring: for example, using tools or services that monitor suppliers’ networks for breaches or weaknesses, or simply scheduling regular reassessments. If a major vulnerability (like “Log4Shell”) emerges, you should be able to quickly identify which vendors might be affected and ensure they take action. Regular audits of high-risk vendors, whether direct or via third-party assessments, are now the norm under NIS2’s regime[69].
  • Incident Response Including Third Parties: Your incident response plan must account for incidents at vendors[70]. If a critical supplier suffers a ransomware attack, how will you respond? NIS2 requires that you report significant incidents (which could originate at a third party affecting your service) within tight timelines. That means you need your vendors to tell you quickly if they have a problem. Many companies are now contractually obligating vendors to notify them of incidents within 24 hours or even sooner, precisely so that the company can fulfill the NIS2 24-hour “early warning” rule. Internally, ensure your incident response playbooks have scenarios for third-party incidents including who in your team contacts the vendor, how you evaluate the impact, and how you escalate to regulators.
  • Collaboration and Info Sharing: NIS2 encourages information sharing (through CSIRTs and the NIS Cooperation Group) among companies and authorities. Organizations should be prepared to share relevant information about threats or incidents. For example, if you discover through a vendor assessment that a certain software component is vulnerable, sharing that intel with an ISAC (Information Sharing and Analysis Center) or industry group might be part of best practices. It’s not a direct requirement, but it’s in the spirit of collective security that NIS2 promotes.
  • Management and Board Oversight: Because top management is accountable, expect more leadership interest in third-party cyber risk. Boards may ask: “What are we doing about supply chain security? Have all our critical suppliers been vetted? How do we know they’re secure?” As a TPRM professional, you should be ready to answer these questions with data: e.g., “Out of 50 critical suppliers, 45 have ISO certifications, 5 we are working closely with to improve; we conduct annual onsite assessments of the top 10 providers; we have incident notification agreements with all critical suppliers,” etc. You might even present a dashboard of vendor risk to the board. This also means any gaps in vendor security could become board-level issues in the event of an incident.
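
To make the board conversation above concrete, a TPRM team can roll its vendor data up into a few headline metrics. A hypothetical sketch (the vendors, tiers, and field names are invented for illustration):

```python
# Hypothetical roll-up of vendor-risk metrics for board reporting.
# Tier 1 denotes the most critical suppliers; data is invented.
vendors = [
    {"name": "HostCo", "tier": 1, "iso27001": True,  "incident_sla_hours": 24},
    {"name": "ChipCo", "tier": 1, "iso27001": False, "incident_sla_hours": 24},
    {"name": "CabCo",  "tier": 3, "iso27001": False, "incident_sla_hours": None},
]

def board_summary(vendors, tier=1):
    """Summarize certification and notification-SLA coverage for one tier."""
    critical = [v for v in vendors if v["tier"] == tier]
    return {
        "critical_vendors": len(critical),
        "with_iso27001": sum(v["iso27001"] for v in critical),
        "with_notification_sla": sum(v["incident_sla_hours"] is not None
                                     for v in critical),
    }

print(board_summary(vendors))
# {'critical_vendors': 2, 'with_iso27001': 1, 'with_notification_sla': 2}
```

The point is not the code but the discipline: if your vendor data is structured, answering "how many critical suppliers lack an incident notification SLA?" becomes a query rather than a scramble.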

Example – Contract Clauses and Due Diligence Questions under NIS2:

  • Sample Contract Clause (Security Requirements): “Supplier shall implement and maintain appropriate cybersecurity measures commensurate with the sensitivity of the services provided, including but not limited to: network security controls, up-to-date patch management, encryption of sensitive data in transit and at rest, and employee security training. Upon request, Supplier will provide [Your Company] with up-to-date documentation of its cybersecurity program and evidence of compliance with recognized security standards.” – This clause sets a general expectation that the supplier actively maintains good security (covering areas explicitly mentioned in NIS2 like patching and training[56][71]). It gives you leverage to ask for proof (e.g., policies, certificates).
  • Sample Contract Clause (Right to Audit and Assess): “[Your Company] or its designated representatives may audit Supplier’s security controls at least annually. Supplier agrees to cooperate with audits, including on-site inspections or penetration testing as applicable, and shall remediate any critical findings within an agreed timeframe. Additionally, Supplier will complete [Your Company]’s cybersecurity questionnaire and provide relevant artifacts (e.g., SOC 2 report) upon request.” This aligns with the NIS2 emphasis on regular audits and supplier accountability[66][69]. It ensures you have the contractual right to verify the supplier’s claims.
  • Sample Contract Clause (Incident Reporting): “Supplier shall notify [Your Company] immediately (no later than 24 hours) after becoming aware of any security incident, breach, or significant cyber threat affecting Supplier’s systems that provide or support the services. Such notice shall include known details of the incident. Supplier shall subsequently provide updates and a post-incident report detailing root cause and remediation.” – This is critical for NIS2 compliance so you can meet the 24-hour reporting rule for significant incidents[59]. It also protects you by ensuring you’re not in the dark if a vendor gets hit.
  • Sample Contract Clause (Data Return/Deletion): “Upon termination of the contract or at [Your Company]’s request, Supplier will securely return or dispose of all [Your Company] data in its possession. Disposal must include permanent deletion from backups and must be certified in writing to [Your Company].” – While this is more of a general data protection clause, it’s a good practice to include as part of mitigating supply chain risk (ensuring vendors don’t retain your sensitive data when they no longer should). NIS2’s focus on security extends through the lifecycle of data, and safe disposal is part of that (though not explicitly stated in the directive, it’s implied in good security hygiene).
  • Due Diligence Question: “What third-party cybersecurity certifications or audits do you have? (e.g., ISO 27001, SOC 2 Type II, Cyber Essentials) Please provide the latest certificate or report.” A positive answer (with evidence) gives confidence. Bitsight’s guidance notes that requiring such certifications is an impactful method to gauge supplier maturity[67].
  • Due Diligence Question: “How do you manage security in your own supply chain? Do you vet and monitor any sub-contractors or service providers you rely on for delivering services to us?” This question addresses fourth-party risk. NIS2 requires you to consider security “concerning the relationships between each entity and its direct suppliers”[72]. A good supplier should also have a handle on their suppliers. If a cloud provider uses a data center colocation facility, do they audit that facility’s security? Such questions can reveal if the vendor is cascading security requirements downwards.
  • Due Diligence Question: “Have you had any cybersecurity incidents in the past 12 months? If so, what did you learn and what changes were made to prevent future incidents?” Under NIS2, if this supplier had a “significant incident” that could affect you, you’d want to know. While they might not disclose everything, a trustworthy vendor should at least share if they had noteworthy incidents and how they improved. This also tests their transparency and incident response capability.

By implementing the above measures, companies align themselves with NIS2’s goals. Essentially, NIS2 forces organizations to take a proactive, structured approach to third-party cyber risk, very similar to how they manage internal cyber risks. The directive’s broad sectoral reach means that even industries that historically didn’t focus heavily on cybersecurity (e.g., water utilities, manufacturing plants) must now elevate their game, including scrutinizing the security of their vendors, suppliers, and service partners.

For vendor risk managers, NIS2 is an opportunity to advocate for stronger TPRM programs across all departments. You might need to educate procurement and business units that these security questionnaires and contract addendums are not just red tape; they’re vital for compliance and resilience. And for CISOs, NIS2 provides leverage to secure budget for supply chain security initiatives (like tools for third-party monitoring, hiring staff to do vendor audits, etc.), because non-compliance is not an option when regulators are watching.

One more note: since NIS2 is implemented via national laws, keep an eye on your country’s specific requirements. Some nations might extend obligations or provide more detailed guidance. For example, one country might explicitly require that you perform annual penetration testing, or that you report incidents to a specific portal. Always cross-check the local regulation, but the principles above hold universally under NIS2.

Pro Tip: As regulatory expectations evolve, investing in staff expertise is key. Professional certifications like the Certified Third Party Risk Management Professional (C3PRMP), offered by the Third Party Risk Institute, are designed to elevate your team’s capabilities in managing vendor risks. In fact, C3PRMP is considered a “gold standard” credential in vendor risk management, recognized by industry associations[1][2]. Building such expertise can help ensure your TPRM program meets the latest global standards and best practices demanded by regulations.

U.S. SEC Cybersecurity Disclosure Rule

What is the SEC Cybersecurity Rule?
In July 2023, the U.S. Securities and Exchange Commission (SEC) adopted a set of rules aimed at standardizing cybersecurity disclosures by public companies[73]. Often referred to as the SEC Cybersecurity Disclosure Rule, it introduced two main requirements for companies that are subject to SEC reporting (generally, U.S.-listed companies and foreign private issuers who file with the SEC):

  1. Material Incident Reporting (Form 8-K/6-K): Companies must file a public disclosure of any “material” cybersecurity incident on Form 8-K within four business days of determining the incident is material[74]. “Material” generally means that a reasonable investor would consider it important, e.g., because it significantly impacts finances or operations. This could be a ransomware attack, a data breach, or even a third-party incident that materially affects the company.
  2. Annual Disclosure of Cyber Risk Management, Strategy, and Governance (Form 10-K/20-F): In their annual report, companies must describe their processes for assessing and managing cybersecurity risks, their approach to cybersecurity governance (including board oversight and management roles), and whether any board members have cybersecurity expertise[75]. This includes disclosing how the company manages risks from third-party service providers and supply chain cybersecurity issues[13].

The rules became effective in late 2023. Large companies were expected to comply with incident reporting by December 2023, and the annual disclosures are required starting with reports for fiscal years ending on or after December 15, 2023 (meaning 2024 annual reports filed in early 2025 must have this info)[76]. Smaller companies had a few more months for the incident reporting requirement.

Why does this matter for third-party risk? The SEC explicitly highlighted that an increasing number of cybersecurity incidents originate at or involve third-party vendors[77]. High-profile examples include the SolarWinds breach of 2020, where attackers compromised a third-party software provider to infiltrate many companies and government agencies. In fact, the SEC’s enforcement division recently penalized companies for misrepresenting the impact of that very incident[78]. The SEC wants investors to be fully informed about a company’s exposure to cyber threats and its ability to manage them, including threats that come through suppliers and partners.

Key Requirements Impacting TPRM:

  • Disclosure of Third-Party Risk Management Processes: In the annual report section on cyber risk management, companies need to state whether and how they consider third-party risks. The SEC’s rule text specifically asks registrants to disclose “whether the registrant has processes to oversee and identify such risks from cybersecurity threats associated with its use of any third-party service provider”[13]. This means if you’re a public company CISO or CRO, you should be prepared to publicly describe your third-party cybersecurity program. For example, a disclosure might read: “We maintain a third-party risk management program that assesses the cybersecurity of critical vendors, requiring them to adhere to our security standards and monitoring their compliance. We utilize vendor risk assessments, contractual security requirements, and continuous monitoring tools to manage third-party cyber risks.” Investors (and regulators) will read this, so it needs to be truthful and adequate. If a company says nothing about third-party risk, that omission could raise red flags, given how common third-party breaches are.
  • Incident Reporting and Third Parties: If a cybersecurity incident at a vendor materially affects the company, it triggers the 4-day disclosure requirement just as if the incident happened in-house. For example, imagine your company relies on a cloud provider and that provider has an outage or breach that stops you from serving customers for a week; that could well be material. The SEC expects you to disclose it. Practically, this means you need to have mechanisms to rapidly learn about incidents at your vendors. A contractual notification obligation (like we discussed under NIS2) is crucial here. You don’t want to be caught off guard by a question on an earnings call like “we heard your supplier was breached, why didn’t you disclose it?” Companies are now asking vendors to inform them immediately of any incident that could impact them, so the company can meet the 4-day filing deadline if needed.
  • Internal Controls and Governance: The SEC rule indirectly forces companies to strengthen their internal cybersecurity governance. Because boards must oversee cyber risks and sign off on disclosures, they are asking management tougher questions. Boards might demand evidence of a working third-party risk management process. It’s wise to implement formal reporting to the board on cybersecurity and third-party risk. Some companies have created management committees or upgraded policies to ensure they can demonstrate a structured approach. The mantra “if it’s not documented, it didn’t happen” applies – regulators and investors expect to see that you have defined processes (like vendor assessment procedures, incident response plans including vendor scenarios, etc.).
  • Accuracy and Transparency: The SEC has shown it will penalize companies for making misleading statements or omissions about cyber incidents[79][80]. For example, if a company downplays an incident’s impact or fails to mention that a breach occurred via a supplier, it could face enforcement. This places a premium on honesty in reporting. For TPRM, it means you should avoid rosy assumptions like “all our vendors are secure” in internal comms or external reports. Instead, acknowledge risks and what’s being done to mitigate them. If a material vendor risk exists (say one of your critical suppliers is vulnerable to a known issue but you’re working on it), better to be candid in the risk factors section of filings. In the long run, trust is built through transparency.
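To make the four-business-day filing window above concrete, here is a small illustrative deadline calculator. It assumes weekends are the only non-business days; a real implementation would also account for federal and market holidays.

```python
# Sketch: compute the Form 8-K filing deadline four business days after
# the date materiality is determined. Weekend-only logic is a
# simplification; real deadlines also skip holidays.
from datetime import date, timedelta

def filing_deadline(determined: date, business_days: int = 4) -> date:
    d = determined
    remaining = business_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday-Friday
            remaining -= 1
    return d

# A Thursday determination rolls over the weekend to the next Wednesday.
print(filing_deadline(date(2025, 3, 6)))  # 2025-03-12
```

Note the clock starts from the materiality determination, not the incident itself, which is exactly why rapid vendor notification clauses matter: you cannot determine materiality for an incident you have not heard about.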

Impacts on Third-Party Risk Management Program:

  • Enhanced Vendor Due Diligence: Public companies are reevaluating how they vet vendors at onboarding and beyond. Since any lapse can lead to an incident that must be disclosed, there’s a strong incentive to choose vendors with good security and to continuously check them. Expect more rigorous security questionnaires sent to vendors and more frequent refreshes (annual or even quarterly for the most critical ones). Companies might also integrate external ratings or intelligence, e.g., using a security rating service that monitors vendors’ internet-facing security (some companies do this to get a heads-up if a vendor has an exposed vulnerability or a data leak on the dark web). The objective is to avoid material surprises from vendors.
  • Contractual Risk Transfer and Assurance: Another trend is companies updating contracts to include provisions that help with compliance and risk mitigation for the SEC rule:
  • Incident Cooperation Clauses: e.g., “Vendor will provide all information necessary to assist [Company] in investigating, remediating, and reporting any cybersecurity incident, including those required by law or regulation (such as SEC disclosures), without delay.” This ensures that if something happens, the vendor can’t drag its feet; you’ll need their cooperation to figure out impact and draft accurate disclosures within days.
  • Representations and Warranties: Many companies now include reps/warranties where the vendor attests that it has no undisclosed security incidents at the time of contracting, and it complies with certain cybersecurity standards. If a vendor lies in those, it could be breach of contract, giving the company legal recourse.
  • Indemnification for Data Breaches: Some contracts include clauses that the vendor will indemnify the company for costs related to data breaches caused by the vendor’s negligence. While this is more about financial risk, it’s part of the overall risk management strategy. If a breach at a vendor leads to shareholder lawsuits or regulatory fines (which can happen after an SEC disclosure), having an indemnity clause might allow you to recover some costs from the vendor.
  • Third-Party Risk Assessment Integration: The SEC rule implicitly requires that the company’s risk assessment process covers third parties. FINRA (the U.S. Financial Industry Regulatory Authority) has even recommended that firms use the SEC’s guidance as a checklist to ensure all cyber risks, including third-party, are identified and managed[81]. Companies are now integrating vendor risk into enterprise risk management. This might mean when you do your annual enterprise risk assessment, “vendor cybersecurity risk” is listed as a key risk with an owner, mitigation plans, etc. Also, internal audits are increasingly auditing TPRM programs to ensure they’re up to scratch, given the regulatory focus.
  • Continuous Improvement and Training: The SEC rule has prompted companies to invest in cybersecurity awareness at all levels. For instance, teams involved in vendor management might receive training on the importance of timely incident escalation. The legal department, security team, and procurement may hold joint exercises to simulate a vendor breach and how they’d handle the disclosure. Mitratech noted the importance of ongoing training and awareness, so staff understand the legal obligations and consequences of cybersecurity issues[82]. This extends to making sure that those who interface with vendors (vendor managers, contract managers) know to immediately flag any hint of a security issue up the chain.
  • CISO and CRO Collaboration: More than ever, CISOs (technology security) and CROs (enterprise risk) or equivalent roles must collaborate on third-party risk. The CISO might own the technical evaluation of vendors (ensuring security requirements are met), while the CRO ensures the risk is tracked in risk registers and reported. Both will likely be involved in drafting the annual disclosure text about cyber risk. Many companies form a cross-functional Cyber Risk Committee that includes IT, security, risk management, legal, and supply chain, to oversee compliance with the SEC rule. This group might review incidents, decide if they’re material, and ensure the required processes (like board updates) happen.

Example – Contract Clauses and Due Diligence Questions under the SEC Rule:

  • Sample Contract Clause (Regulatory Cooperation): “Vendor acknowledges that [Company] is subject to certain cybersecurity disclosure regulations (e.g., SEC rules). Vendor shall promptly provide information requested by [Company] that is necessary for [Company]’s compliance with such regulations, including details of cybersecurity incidents, vulnerabilities, or risks affecting Vendor’s services to [Company].” This clause makes it clear the vendor must feed you the info you need to comply with your legal duties. It sets the expectation upfront that you might come asking for detailed information, and they can’t refuse on grounds of confidentiality or PR concerns.
  • Sample Contract Clause (Breach Notification & Remediation Costs): “In the event of a security breach at Vendor affecting [Company]’s data or operations, Vendor shall (a) immediately notify [Company] as per the agreed SLA, (b) take all necessary steps to remediate the breach, (c) cooperate fully with [Company]’s investigation, and (d) indemnify [Company] for all reasonable costs incurred in responding to and recovering from the breach, including costs of notifications, credit monitoring, forensic investigations, and regulatory penalties, to the extent the breach was caused by Vendor’s negligence or willful misconduct.” While a bit strong, clauses like this are increasingly common. They both ensure quick notification (for compliance) and attempt to financially protect the company. Even if you can’t negotiate full indemnity, the act of discussing it will make the vendor acutely aware of how seriously you take breaches.
  • Due Diligence Question: “Who in your organization is responsible for cybersecurity and what is your governance structure? (Do you have a CISO or similar role? How often do they report to your executive team or board?)” This aligns with the SEC rule’s focus on governance and can be telling. A vendor with a named accountable security officer and regular leadership oversight is likely managing risk better than one without. If a vendor answers, “We don’t have a dedicated security officer, our IT manager handles it part-time,” that’s a red flag for a critical vendor.
  • Due Diligence Question: “Can you provide a summary of any cybersecurity incidents in the past 3 years and how they were resolved? Were any customers or regulators notified?” This directly probes their incident history. You’re gauging both frequency/severity of incidents and transparency. If a vendor had a breach and didn’t tell clients, you definitely don’t want to learn that post-contract. Conversely, if they candidly describe a minor incident and improvements made, that builds trust. Since the SEC expects your company to disclose incidents, you need vendors who will be open about incidents with you.
  • Due Diligence Question: “Do you carry cyber insurance that covers incidents impacting clients? If yes, what does it cover?” This is a more business-oriented question, but useful. If a vendor has robust cyber insurance, it might indirectly protect you (e.g., funds to handle incident response which means they recover faster, or cover liabilities which might include your losses). It also indicates they take cyber risk seriously enough to insure against it.
  • Due Diligence Question: “Describe how you assess and manage risk from your third-party providers (fourth-party risk).” Similar to the NIS2 question earlier, but even a U.S. company should consider it. The SEC rule doesn’t explicitly mention fourth parties, but if your critical vendor has a critical vendor, that chain can be a source of material risk. Asking this informs your risk narrative; e.g., if a vendor says “we outsource development to a company in another country,” you now know there is an additional layer of risk to consider.

In essence, the SEC Cybersecurity Disclosure Rule is less prescriptive about how you manage third-party risk (compared to DORA/NIS2), but it creates powerful transparency and accountability pressures. No company wants to be the subject of an SEC enforcement action or class-action lawsuit because they claimed “all is well” with cybersecurity when in reality they had poor vendor oversight and got breached. Thus, public companies are using this rule as a catalyst to shore up their TPRM programs: formalizing vendor risk committees, investing in better tools, and revising policies.

For CISOs and vendor risk managers, this is an opportunity to get executive buy-in. You can go to the board or audit committee and say, “The SEC is requiring us to disclose our cybersecurity and third-party risk management effectiveness. We need to ensure we have a strong story to tell and that means improving X, Y, Z in our program.” In many cases, leadership will support initiatives like deeper assessments of key vendors, better contractual protections, or additional staffing to manage third-party risk because the regulatory mandate makes the risk tangible.

Finally, note that while this rule directly impacts public companies, its effects ripple out. Private companies that are vendors to public ones may feel the indirect pressure (as their customers demand more assurances and clauses now). And the trend in other jurisdictions (UK, other countries) is also toward requiring more disclosure of cyber preparedness. So, it’s wise for any sizable company, even if not SEC-regulated, to align with these practices as a forward-looking measure.

Now that we’ve explored each major regulation and its implications on third-party risk and contracts, the next section will distill this into actionable guidance. We’ll present checklists for compliance with DORA, NIS2, and the SEC rule, focusing on what steps you should take in your third-party risk management program to meet the requirements and, ultimately, to enhance your organization’s resilience.

Compliance Checklists for TPRM under DORA, NIS2, and SEC

Regulations can be complex, but breaking down the requirements into concrete tasks can make compliance manageable. Below are checklists tailored for each regulation, with a focus on third-party risk management actions. Use these as a starting point to assess your readiness and track progress. Each checklist is generalized for a multinational context (remember to adapt to any specific local law nuances).

DORA Compliance Checklist (Financial Sector Focus)

For financial entities in the EU, ensure your third-party risk management aligns with DORA:

  • Identify ICT Third Parties & Critical Services: Compile an up-to-date inventory of all your ICT service providers. Flag which ones support critical or important functions (those whose disruption could significantly impact your operations or compliance)[24][83]. This inventory (register) should be readily available and include details like services provided, data shared, and renewal dates.
  • Conduct Pre-contract Due Diligence: Before onboarding any new tech vendor, perform a thorough risk assessment of their security and resilience. Use standardized questionnaires and require evidence (certifications, audit reports)[47]. For high-risk vendors, involve cybersecurity experts to do in-depth evaluations or even on-site audits.
  • Update Contracts with DORA Clauses: Review all existing contracts with ICT providers and add any missing DORA-required provisions. At minimum, ensure contracts include:
  • Clear service descriptions and roles[27][28].
  • Data location disclosure and change notification[30].
  • Data security and confidentiality commitments[31].
  • Incident notification and assistance obligations (e.g., “notify within X hours, assist without extra charge”)[36].
  • Regulator cooperation clause (vendor agrees to facilitate inspections/information requests by authorities)[37].
  • Audit and access rights for you and regulators[40][41].
  • Subcontracting conditions (require approval for sub-outsourcing critical functions)[29].
  • Termination and exit strategy provisions (including a transition period for handover)[43][44].
  • Service level agreements with measurable uptime/security metrics, especially for critical services[34][84].

If any key provider resists these changes, document the risk and have mitigation plans (and be prepared to justify to regulators why an exception exists). Regulators expect these as minimum contract contents[45].

  • Establish Continuous Vendor Monitoring: Implement a process to regularly monitor third-party performance and risk:
  • Hold periodic service review meetings with vendors, reviewing SLA reports and security updates.
  • Require critical vendors to provide annual cybersecurity self-assessments or third-party audit results to you.
  • Use tools (if available) to get alerts on your vendors (e.g., if a vendor’s system is found in a botnet or their SSL certificate expires, both indicators of potential issues).
  • Track news or advisories about your key vendors (e.g., subscribe to threat intel or news feeds).
  • Incident Reporting Protocol (Internal): Adjust your internal incident response plan to incorporate DORA’s reporting requirements. Ensure that if a vendor-related incident occurs, you can gather information quickly and report major incidents to your national regulator within the required time frame[21]. Assign responsibility to a team/person for regulatory incident reporting. Conduct a drill to simulate a major vendor outage and practice the reporting workflow.
  • Test Third-Party Resilience: Include scenarios involving third-party failures in your digital operational resilience testing:
  • Conduct at least annual continuity tests where you simulate the unavailability of a critical supplier (e.g., switch to backups or alternate providers) and see if you can still operate.
  • If you’re required to do threat-led penetration testing, coordinate with critical vendors to include interfaces between you and them[39].
  • Share test results with the vendor and work on any gaps (e.g., if a failover from Vendor A to Vendor B took too long, improve that together).
  • Monitor Concentration Risk: Yearly (or more frequently), review whether you have excessive dependence on a single vendor or a small group of vendors[50]. If yes, develop a mitigation plan: perhaps diversify suppliers, or strengthen contingency plans. For example, if all your critical systems are on one cloud provider, consider a multi-cloud strategy or at least a plan to migrate quickly if needed.
  • Keep Documentation and Evidence: Maintain documentation of all of the above: due diligence records, contract amendments, monitoring reports, incident logs, test results, etc. DORA compliance will likely be evaluated by regulators via audits or supervisory questionnaires. Being able to show evidence (e.g., “here’s our contract checklist and the updated contracts,” “here’s the report of our last vendor risk review meeting”) will demonstrate your compliance[85]. It’s also invaluable for internal accountability and continuous improvement.
  • Train Staff on DORA Requirements: Ensure teams involved in outsourcing/procurement and IT are aware of DORA obligations. Provide guidance or training to vendor relationship managers about the new contract terms and why they’re important, so they can better negotiate and manage vendor expectations.
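The provider inventory from the first checklist item lends itself to simple structured data. Below is a minimal sketch with made-up field names; note that DORA’s register of information has a prescribed format set by the European supervisory authorities, so treat this only as an internal working model, not the regulatory template.

```python
# Minimal sketch of an internal ICT third-party register.
# Field names are illustrative assumptions; the official DORA
# register-of-information templates define the actual required format.
from dataclasses import dataclass
from datetime import date

@dataclass
class IctProvider:
    name: str
    services: list[str]
    supports_critical_function: bool
    data_shared: str
    contract_renewal: date

register = [
    IctProvider("ExampleCloud", ["hosting"], True, "customer PII", date(2026, 1, 31)),
    IctProvider("MailTool", ["email marketing"], False, "contact lists", date(2025, 9, 1)),
]

# Flag the providers supporting critical or important functions.
critical = [p.name for p in register if p.supports_critical_function]
print(critical)  # ['ExampleCloud']
```

Keeping the register as structured data (rather than a free-form document) makes it easy to answer supervisory questions quickly, e.g., filtering for upcoming renewals that still need DORA clauses added.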

By following these steps, your organization will not only meet DORA’s third-party risk mandates[5] but also greatly reduce the likelihood of an unmitigated third-party incident disrupting your operations.

NIS2 Compliance Checklist (All Sectors – Critical/Important Entities in EU)

If your organization falls under NIS2 (check your sector and size!), use this checklist to bolster supply chain cybersecurity:

  • Verify NIS2 Applicability: Confirm that your organization is listed as an essential or important entity in your country’s NIS2 legislation. If unsure, consult the national authority or industry regulator. This determines your compliance obligations (and reporting channels). Also identify which regulator or CSIRT you would report incidents to (it may vary by sector/country).
  • Implement a Cybersecurity Risk Management Program: Ensure you have a documented cybersecurity strategy/policy that covers all the areas listed in NIS2 Article 21[55][86]. Specifically:
  • Have a cybersecurity policy approved by management.
  • Include risk assessment processes (identifying threats and vulnerabilities regularly).
  • Ensure you have incident handling procedures (and that staff know them).
  • Have business continuity and disaster recovery plans for ICT disruptions, and test them.
  • Incorporate supply chain security into your policies, e.g., a vendor security policy or section that outlines how third-party risks are managed[57].
  • Set up metrics or assessments to evaluate effectiveness of your controls (e.g., internal audit or security testing results).
  • Cover basics like access control, data encryption, and multi-factor authentication (for remote/admin access)[71]; these reduce both direct and third-party risks (e.g., requiring multi-factor authentication for your contractors too).
  • Provide cybersecurity awareness training for employees and relevant third parties (perhaps require your vendors’ staff who handle your info to be trained as well)[87][71].
  • Embed Supply Chain Security in Procurement: Update your procurement and vendor management procedures to align with NIS2:
  • Include cybersecurity criteria in vendor selection (e.g., security questionnaire as part of RFP, minimum security requirements as a qualifier).
  • Establish a vendor risk rating system (high/medium/low risk vendors) based on access to critical systems or data.
  • For high-risk vendors, mandate a security review or approval by the InfoSec team before contract signing.
  • Check if any of your suppliers themselves fall under NIS2 (if yes, they have their own obligations, and you can ask them how they comply).
  • Strengthen Contracts with Security Clauses: As with DORA, ensure contracts with suppliers include:
  • Obligations to implement appropriate security measures (possibly reference standards or specific controls).
  • Audit rights for you or third-party assessors[88].
  • Breach notification requirements (immediate or within 24 hours)[89].
  • Data protection and data return/deletion clauses (especially if personal data is involved, this dovetails with GDPR too).
  • Right to request remediation of any identified security gaps.
  • Compliance with applicable cybersecurity laws (e.g., “Vendor will comply with all applicable cybersecurity regulations,” which indirectly covers NIS2 if it applies to them, or at least sets the expectation that they will cooperate with your obligations).

Work with your legal team to develop standard “NIS2 addendum” language that can be appended to all critical vendor contracts if not already included. This makes rollout easier.
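The high/medium/low vendor risk rating mentioned in the procurement step can start as a simple rule. The two factors and thresholds below are illustrative assumptions; adapt them to your own rating criteria (data sensitivity, service criticality, substitutability, etc.).

```python
# Illustrative tiering rule using two assumed factors: access to
# critical systems and access to sensitive data. Real programs
# typically score more dimensions and weight them.
def risk_tier(critical_system_access: bool, sensitive_data_access: bool) -> str:
    if critical_system_access and sensitive_data_access:
        return "high"
    if critical_system_access or sensitive_data_access:
        return "medium"
    return "low"

print(risk_tier(True, True))    # high   -> InfoSec review before signing
print(risk_tier(False, True))   # medium -> standard questionnaire
print(risk_tier(False, False))  # low    -> basic contract clauses
```

The value of even a crude rule like this is consistency: every vendor gets triaged the same way, and the tier then drives which due diligence steps and contract clauses apply.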

  • Conduct Supplier Security Assessments: Perform a security risk assessment of key suppliers at least annually[90]:
  • Use a detailed questionnaire or framework (some companies use ISO 27001 controls as a baseline, or the SIG questionnaire).
  • If resources allow, do on-site audits or interviews with the most critical suppliers.
  • Obtain and review SOC 2 reports, ISO certificates, penetration test results, or vulnerability scan reports from suppliers where possible[91].
  • Rate each supplier’s residual risk (e.g., “Supplier X: medium risk due to some gaps in patch management accepted with mitigation that we monitor monthly”).
  • Track findings and ensure there’s a mechanism to follow up on any required remediation by the supplier. For example, if a vendor lacks an incident response plan, ask them to develop one, and consider holding quarterly meetings until it’s done.
  • Enhance Monitoring & Communication:
  • Set up clear internal channels to escalate vendor-related cyber issues. For instance, if anyone in the company (not just IT) gets wind of a vendor having trouble (maybe a news article about your supplier getting hacked), they should know to alert the security team immediately. Speed is crucial for 24-hour reporting[59].
  • Maintain up-to-date contact lists for all critical vendors (including after-hours/emergency contacts). You don’t want to scramble for who to call at a vendor during an incident.
  • Consider joining industry information-sharing groups (many sectors have ISACs, or Information Sharing and Analysis Centers). They often share third-party risk information and threat intel, which can give early warnings of supply chain threats.
  • Prepare for Incident Reporting: NIS2’s tight timelines mean you should have a pre-established process:
  • Define what constitutes a “significant incident” for your organization (likely anything causing major outage, large data loss, or safety risk).
  • Create an incident notification template now, to be ready to fill in quickly. It might include fields like description of incident, affected systems, mitigation steps, cross-border impact, etc. Reference Article 23’s requirements for what the final report needs[60]: severity, impact, type of threat, root cause, measures taken, etc.
  • Train your incident response team on how to assess impact swiftly, so you can meet the 24h/72h deadlines. Perhaps designate an “incident reporting officer” who will handle communications with the regulator.
  • Coordinate with your PR/communications team too, because NIS2 allows authorities to require you to notify the public in some cases[92]. Have a plan for public disclosures (like a press release draft) in case an incident must be made public to prevent harm or manage an ongoing incident.
  • Management Oversight & Accountability: Involve senior management in TPRM:
  • Provide regular updates to your executive committee or board on the cybersecurity posture of your supply chain. This could be part of a quarterly risk report.
  • Document these briefings to show regulators (if ever needed) that management is informed; remember, top management can be held accountable for non-compliance[93].
  • If gaps are identified in third-party risk (say you have 100 suppliers but only 20 have been assessed so far), get a plan approved by management to close those gaps, showing a commitment from the top.
  • Stay Informed on National Guidance: Keep an eye out for guidelines from your country’s authorities or the EU’s cybersecurity agency (ENISA). For instance, ENISA might publish best practices for NIS2 supply chain security, or your regulator might issue specific rules (like mandatory use of certain crypto, or a specific format for incident reports). Adapt your program as these emerge. Subscribe to regulatory newsletters or industry forums where these updates are discussed.
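
The 24-hour/72-hour reporting cadence described above can be made concrete in tooling. Below is a minimal sketch that computes the staged NIS2 deadlines from the time you become aware of an incident, plus a skeleton of the notification template suggested earlier. The field names and the 30-day approximation of “one month” are illustrative assumptions, not a regulator-mandated schema:

```python
from datetime import datetime, timedelta

def nis2_reporting_deadlines(aware_at: datetime) -> dict:
    """NIS2 Art. 23 staged reporting: early warning within 24 hours of
    becoming aware, incident notification within 72 hours, and a final
    report no later than one month after the notification."""
    early_warning = aware_at + timedelta(hours=24)
    notification = aware_at + timedelta(hours=72)
    final_report = notification + timedelta(days=30)  # approximating "one month"
    return {
        "early_warning_due": early_warning,
        "incident_notification_due": notification,
        "final_report_due": final_report,
    }

# Skeleton of the pre-built notification template; field names are
# illustrative, mirroring the Article 23 final-report contents.
INCIDENT_TEMPLATE = {
    "description": "",
    "severity_and_impact": "",
    "type_of_threat": "",
    "likely_root_cause": "",
    "mitigation_measures_taken": "",
    "cross_border_impact": "",
    "affected_systems_and_vendors": "",
}
```

Pre-computing the deadlines the moment an incident is logged removes one source of confusion during the first chaotic hours of a response.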

By systematically executing these steps, you’ll move your organization toward solid NIS2 compliance. More importantly, you’ll greatly reduce the risk that a supplier incident catches you off-guard or that a regulator finds your supply chain security insufficient[94][69]. NIS2 is about building cyber resilience across the board, and supply chain is a big piece of that puzzle.

SEC Cybersecurity Disclosure Rule Compliance Checklist (Public Companies)

For publicly traded companies (or those planning IPOs), align your cybersecurity and third-party risk practices with SEC expectations:

  • Integrate Cybersecurity into ERM and Board Reporting: Make sure cybersecurity risk, including third-party risk, is part of your Enterprise Risk Management (ERM) framework. Maintain a risk register entry for third-party cyber risk with an assigned owner and regular review. Provide updates to the board or a board committee (e.g., Audit or Risk Committee) at least quarterly on cybersecurity. Document these meetings and topics covered (this will help in drafting disclosures and demonstrating governance). Essentially, ensure you can truthfully say “the board is informed and involved” in the 10-K disclosure.
  • Create a Narrative of Your Cyber Risk Management: Work with your CISO, CIO, and risk managers to formulate a clear description of how your company manages cyber risks. This should cover:
  • Identification of risks (e.g., continuous monitoring, threat intelligence).
  • Protection measures (security architecture, policies, vendor standards).
  • Detection and response (24/7 SOC, incident response plan).
  • Recovery (backups, business continuity).
  • Governance (roles and responsibilities, e.g., “We have a CISO who reports to the CTO and briefs the Board quarterly. We have a cyber risk committee…”).
  • Third-party oversight (e.g., “We assess critical vendors annually and require security terms in contracts”)[95].

This narrative will feed your annual report disclosure. Have it reviewed by legal counsel and ensure it matches reality. If there are areas of weakness (say you don’t yet assess all vendors annually), either improve the practice or be transparent about it. It’s better to say “we prioritize top vendors for annual assessment and are expanding the program” than to mislead.

  • Revise Vendor Risk Management Program for Transparency: Strengthen any weak spots in your TPRM so you can confidently disclose it:
  • If you haven’t already, categorize vendors by criticality (tiering). The SEC doesn’t force you to, but it’s a common and sound practice to justify resource allocation.
  • Ensure you perform some level of due diligence on all critical vendors (so you can’t be accused of ignoring an obvious risk).
  • Implement continuous monitoring for critical third parties, whether by asking them for quarterly attestations or by using an external rating service, so you can say “we continuously oversee key third-party risks”[96].
  • Keep records of all vendor assessments and risk decisions. If ever challenged (e.g., after an incident, plaintiffs might request evidence of your risk management), you can show a considered approach.
  • Incident Response Readiness (4-Day Rule): Sharpen your incident response process specifically for disclosure:
  • Establish an internal policy that any potential material incident is escalated to an incident response team and a disclosure review team (likely legal, IR, communications) immediately.
  • Define “materiality” triggers. While materiality is judgment-based, give guidance: e.g., “customer data breach of X records, downtime of Y hours impacting revenue, incident causing more than Z% of assets in damages, etc., should be considered for disclosure.” This helps teams on the ground know what to flag.
  • Have a draft Form 8-K template for cyber incidents ready. It should have placeholders for date of incident, nature of incident, scope of impact, etc. Having a template speeds up the preparation when time is ticking.
  • Coordinate with your General Counsel and external counsel (if needed) on the process for making the disclosure. Some companies set up a special committee (like the Disclosure Committee) to vote on materiality decisions quickly.
  • Don’t forget that the four-business-day timeline starts ticking once you determine an incident is material, not from when the incident occurred. So, refine your internal decision-making: you might decide materiality within, say, 48 hours of detection to leave time for drafting and filing.
  • Vendor Incident Notification & Contingency: Because a lot of incidents can involve third parties (as SEC highlighted[77]), do a couple of things:
  • Revisit contracts to ensure vendors must inform you of incidents promptly (if not already done in earlier steps). If a key vendor is breached and doesn’t tell you for a week, you cannot meet the SEC’s deadlines, which is unacceptable. Negotiate amendments if needed for at least the most critical third parties.
  • Develop a playbook for third-party incidents: if a vendor calls and says “we were hacked,” your team should know: who gathers info from the vendor, how do we assess our exposure, do we need to shut off integrations, etc., and who decides if it’s material to us.
  • If you rely on third parties for critical operations, consider setting thresholds that if they go down for over X hours, it might be material. E.g., if your entire website is hosted by a third party and it’s down for a day, that’s likely material. Plan for alternate communication to customers, etc., but also plan to disclose if needed.
  • Improve Internal Controls & Documentation: The SEC rules implicitly expect that companies have disclosure controls and procedures for cybersecurity info. In your SOX 302 subcertifications (if applicable), consider adding a question for business units: “Have you experienced any cybersecurity incidents or become aware of any cyber risks (including at third parties) that have not been communicated to the security team and legal?” This kind of questionnaire to division heads each quarter can surface issues that need consideration for disclosure. It creates a formal process to gather info.
  • Document any decisions not to disclose something, for example, if a minor breach occurred but was deemed not material, log that decision and the reasoning. This shows you went through a process (in case the SEC ever questions it).
  • Ensure your CFO and CEO (who sign off on 10-K/Q) are briefed on cyber risks and incidents, so they’re comfortable that the disclosures are complete. This might mean a pre-filing meeting where the CISO presents recent incidents and the status of cyber programs, and the disclosure language is reviewed in that context.
  • Training and Cultural Shift: Educate relevant employees about the SEC requirements:
  • The security team and IT should understand that even smaller incidents need to be evaluated for materiality and potentially escalated; no more “let’s quietly fix this and not tell anyone.” Emphasize the importance of transparency internally; hiding or ignoring incidents can lead to big trouble.
  • The investor relations and communications teams should be looped in so they can prepare to handle questions from investors or media once disclosures happen. This is more about PR, but it affects how you craft the disclosure.
  • Middle management and operations folks should be aware that if they see something (like a key system malfunction that might be a cyber issue), they need to speak up. Basically, break down any silos; surfacing cyber risks is now a company-wide responsibility.
  • Consider running a simulation specifically of the disclosure process: e.g., simulate that a ransomware attack happened, practice going through the steps of assessing impact, getting the lawyers involved, drafting the 8-K, etc., all within a couple days. This will highlight any bottlenecks or confusion to fix before a real event.
  • Monitor Regulatory Developments: The SEC rules could evolve (there was debate about aspects of the rule, such as exactly what to disclose about incidents), and other regulators are considering similar disclosure rules (in Europe, for example). Keep in touch with your legal counsel or industry associations to adapt your compliance as needed. For example, if the SEC issues FAQs or guidance on how to handle scenarios (like third-party incidents that are ongoing), update your procedures accordingly.
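
Because the four-business-day clock runs from the materiality determination rather than the incident itself, it helps to compute the filing deadline explicitly. A minimal sketch follows; the function name is illustrative, weekends are skipped, and market holidays are deliberately not handled (a real implementation would also subtract them):

```python
from datetime import date, timedelta

def form_8k_deadline(materiality_determined: date) -> date:
    """Return the Form 8-K due date: four business days after the day
    the incident was determined to be material. Weekends are skipped;
    market holidays are NOT handled in this sketch."""
    d = materiality_determined
    remaining = 4
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            remaining -= 1
    return d

# A determination late in the week pushes the deadline across the
# weekend: Thu 2025-03-06 -> due Wed 2025-03-12.
```

Surfacing this date automatically in your incident tracker gives the disclosure review team an unambiguous countdown from the moment materiality is decided.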

By following this checklist, your organization will be in a strong position to comply with the SEC’s cybersecurity disclosure requirements. It will also signal to investors that your company is serious about cybersecurity oversight, which can be a positive differentiator. Moreover, these steps align with good security practices; they’re not just for show. Companies that manage these well are likely to be better prepared for cyber threats and recover faster, which in turn minimizes the likelihood of having to disclose devastating incidents.

One side benefit: through this process you might discover previously unknown vulnerabilities or gaps (maybe you find that a critical supplier never underwent a security review, and now you will address it). Plugging those holes improves your actual security posture, not just your compliance posture.

Conclusion

The year 2025 marks a pivotal moment in the world of cybersecurity and third-party risk management. The introduction of EU DORA and NIS2, alongside the U.S. SEC’s cybersecurity disclosure rules, has fundamentally raised the expectations on organizations to manage and prove their cyber resilience, especially in relation to vendors and suppliers. The security of third-party relationships, once often an afterthought, is now front and center, with regulators demanding not just words but action: concrete contract clauses, documented risk assessments, swift incident reporting, and engaged oversight from the boardroom down to procurement.

For CROs, CISOs, and vendor risk managers, the message is clear: Third-party risk management is no longer just good hygiene, it’s a regulatory imperative. But beyond complying with the letter of the law, embracing these practices will yield tangible benefits. By rigorously assessing vendors, insisting on strong security terms, and preparing for the worst-case scenarios, organizations inherently reduce the likelihood of costly breaches and operational disruptions. In essence, good TPRM compliance = good business resilience.

Implementing the guidance in this blog (from embedding key contractual protections under DORA, to conducting supply chain cyber audits for NIS2, to improving incident escalation procedures for SEC compliance) will involve effort and cross-functional collaboration. It may require tough conversations with vendors, investment in new processes or tools, and internal cultural shifts toward transparency and accountability. However, these efforts are investments in trust: regulators will trust that your organization is safe to do business with in the digital economy, your customers will trust that you won’t be the next headline due to a supplier’s lapse, and your stakeholders will trust management’s stewardship of risk.

A few practical takeaways to remember:

  • Start with a Gap Analysis: Assess where your current third-party risk practices stand against each checklist. Prioritize closing the most critical gaps (e.g., contracts missing termination rights, or vendors that have never been assessed for security).
  • Engage Your Vendors as Partners: Share some of these regulatory expectations with your key third parties. Many will be willing to improve once they understand the mutual benefits and obligations. For instance, discuss NIS2 with EU suppliers; they might be in scope too and could collaborate with you on improving security measures.
  • Leverage Frameworks and Training: Don’t go it alone; use industry frameworks (like ISO 27001 or NIST CSF) to structure your risk management. And consider building internal expertise through certifications like the C3PRMP. The Certified Third-Party Risk Management Professional training covers global best practices and regulatory requirements, equipping your team to design and run an effective TPRM program[1][2]. Graduates of such programs bring back templates, ideas, and a network of peers, which can accelerate your compliance projects.
  • Document Everything: In the eyes of regulators, if it’s not documented, it didn’t happen. Keep thorough records; they will be your evidence of compliance and improvement over time. This also helps immensely during personnel changes or audits.

Lastly, view these regulatory changes not as a box-ticking exercise but as an opportunity to uplift your organization’s risk posture. Cyber threats will only continue to evolve, and supply chain attacks are on the rise. By implementing the steps for DORA, NIS2, and the SEC rule, you are building robust shields around your extended enterprise. You’re also sending a message to your clients and partners that your organization is a trustworthy, resilient node in the interconnected business ecosystem.

In conclusion, effective third-party risk management in 2025 and beyond will be characterized by proactivity, thoroughness, and accountability. Companies that excel in this domain will not only avoid fines and incidents but can even gain a competitive advantage by being seen as a reliable partner in a world of cyber uncertainty. As you put these guidelines into practice, remember that compliance is the floor, not the ceiling. Strive for a culture where security and risk management are ingrained in every external relationship. That culture, supported by clear processes and knowledge (perhaps bolstered by programs like C3PRMP), will ensure that whether it’s a regulator’s question or a real-world cyberattack, your organization can respond with confidence and resilience.
