
IR Readiness Self-Assessment

You cannot improve what you have not measured, and incident response readiness is no exception. An IR readiness self-assessment is a structured evaluation of your organisation’s ability to detect, respond to, and recover from a cyber security incident. It identifies gaps before a real incident exposes them — when the cost of discovery is a checklist, not a crisis.

Many organisations assume they are prepared because they have an IR plan document. But a plan that has never been tested, with contact details that are out of date and roles that have never been rehearsed, provides a dangerous false sense of security.

Key Assessment Areas

A comprehensive IR readiness assessment examines five domains:

  1. Plan and documentation. Does a written IR plan exist? Is it current (reviewed within the last 12 months)? Does it cover all four NIST phases? Are roles, responsibilities, and escalation paths clearly defined?
  2. People and skills. Are IR team members identified by name with deputies assigned? Have they received training? Do non-technical stakeholders (legal, communications, executive) understand their roles?
  3. Technology and tools. Do you have detection capabilities (SIEM, EDR, log monitoring)? Can you isolate compromised systems quickly? Do you have forensic tools or access to external forensic support?
  4. Communication and coordination. Are internal and external communication plans documented? Do you have out-of-band communication channels (in case email is compromised)? Are regulator and insurer contact details current?
  5. Testing and improvement. When was the IR plan last tested? Have you conducted a tabletop exercise in the past year? Are lessons from previous incidents or exercises documented and acted upon?

Diagram

IR Readiness Maturity Model — Five Assessment Domains

Radar/spider chart with five axes — Plan & Documentation, People & Skills, Technology & Tools, Communication & Coordination, Testing & Improvement — showing a sample organisation’s current maturity level versus target state.

Running the Assessment

You do not need to hire a consultant to begin. A practical approach for SMEs:

  • Assemble a small review group: Include your IT lead, a business manager, and if possible someone from legal or compliance. The assessment should not be IT-only — business perspective is essential.
  • Score each domain: Use a simple 1-5 scale (1 = no capability, 3 = basic capability with gaps, 5 = mature and tested). Be honest — an optimistic assessment defeats the purpose.
  • Identify the top three gaps: Focus on the areas that score lowest and would have the greatest impact during a real incident. Common priorities are out-of-date contact lists, untested plans, and lack of out-of-band communications.
  • Create an action plan: For each gap, define a specific action, an owner, and a target completion date. Keep it realistic — three meaningful improvements are better than twenty items that never get done.
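The scoring and gap-identification steps above can be sketched in a few lines of Python. The domain names come from the five assessment areas; the scores and the `top_gaps` helper are illustrative, not part of any standard tooling:

```python
# Sketch of the scoring step: record a 1-5 score per domain, then
# surface the lowest-scoring domains as priority gaps.
# Scores shown are a made-up example, not benchmarks.

def top_gaps(scores: dict[str, int], n: int = 3) -> list[tuple[str, int]]:
    """Return the n lowest-scoring domains (1 = no capability, 5 = mature and tested)."""
    for domain, score in scores.items():
        if not 1 <= score <= 5:
            raise ValueError(f"{domain}: score must be 1-5, got {score}")
    # Stable sort: ties keep the order the review group entered them in.
    return sorted(scores.items(), key=lambda item: item[1])[:n]

# Example output of an honest half-day review session:
scores = {
    "Plan & Documentation": 3,
    "People & Skills": 2,
    "Technology & Tools": 4,
    "Communication & Coordination": 1,
    "Testing & Improvement": 2,
}

print(top_gaps(scores))
# → [('Communication & Coordination', 1), ('People & Skills', 2), ('Testing & Improvement', 2)]
```

Recording the scores in a structured form like this also makes the six-monthly re-assessment easy to compare against the previous round.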

Red Flags to Watch For

During your assessment, certain findings should raise immediate concern:

  • The IR plan has not been updated in more than two years.
  • No tabletop exercise has ever been conducted, or the last one was more than 18 months ago.
  • The IR plan relies on email for all communications — if email is compromised, the response is paralysed.
  • Nobody outside IT has seen the IR plan or knows their role in a response.
  • There is no relationship with an external IR support provider (retainer or pre-agreed terms).

Action Steps

  • Schedule a half-day IR readiness review with your IT lead and at least one business stakeholder.
  • Score your organisation against the five domains above and document the results.
  • Identify your three most critical gaps and assign owners with deadlines to address them.
  • Plan to repeat the assessment every six months to track improvement.

Quick Knowledge Check

  1. What are the five domains of an IR readiness assessment?
    Plan and documentation, people and skills, technology and tools, communication and coordination, testing and improvement.
  2. Why should the assessment not be conducted by IT alone?
    Because incident response involves business decisions, legal obligations, and communications — not just technical actions. A business perspective is essential for an accurate assessment.
  3. What is a critical red flag regarding communications?
    Relying solely on email for IR communications. If email is compromised during an incident, the entire response coordination is paralysed.