1. What Is an NDIS Incident Register?

An NDIS incident register is a centralised log that captures summary information about every incident occurring within your disability service provision. Unlike a detailed incident report — which documents the full narrative of a single event — the register provides a bird's-eye view of all incidents across your organisation in one location.

Think of the relationship between individual incident reports and the incident register as the relationship between individual bank transactions and a bank statement. The incident report captures the detail; the register captures the pattern.

The register serves three critical purposes within your compliance framework:

Every registered NDIS provider — regardless of size — must maintain an incident register. Whether you support two participants or two hundred, the NDIS Commission expects to see a functioning register that aligns with your incident management policy and procedures.

2. Why Your Incident Register Matters for Compliance

The NDIS Practice Standards Core Module establishes the requirements for incident management under Outcome 2.4. This outcome requires that providers have systems in place for preventing, identifying, recording, responding to, reporting, and learning from incidents. Your incident register sits at the centre of this system.

During a certification or verification audit, auditors will typically request your incident register within the first hour of the document review. They use it as a starting point to understand the volume, nature, and management of incidents across your service. From the register, they will select specific incidents to examine in detail — pulling the corresponding incident reports, checking timeframes, verifying follow-up actions, and cross-referencing with participant files.

Practice Standards Requirements

The specific requirements your register must address include:

For SIL providers specifically, the register takes on additional significance. Supported Independent Living environments present inherent risks — medication management, participant interactions, overnight supervision gaps, and household safety. Auditors reviewing SIL providers expect to see a register that reflects the genuine operational reality of a 24/7 support environment, not a blank or near-blank register that suggests incidents are not being captured.

Key Insight

A common misconception among new providers is that a "clean" register with few or no incidents looks good at audit. The opposite is true. Auditors are suspicious of empty registers because they suggest incidents are occurring but not being recorded. A well-populated register with documented responses and follow-up demonstrates a mature safety culture.

3. Mandatory Fields Every Incident Register Must Include

While the NDIS Commission does not prescribe a specific register template, auditors expect to see certain fields that enable systematic tracking and analysis. The following fields should be considered mandatory for any NDIS incident register:

Field | Purpose | Example
Incident reference number | Unique identifier for cross-referencing with the detailed incident report | INC-2026-0042
Date and time of incident | When the incident occurred (not when it was reported) | 06/04/2026, 14:30
Date and time reported | When the incident was reported internally — enables timeframe compliance checks | 06/04/2026, 15:15
Location | Where the incident occurred (SIL house, community, day program, etc.) | SIL House — 42 Elm St, Bendigo
Participant(s) involved | Name and NDIS number of affected participant(s) | Jane Doe — NDIS 431 234 567
Staff involved | Name and role of staff present or involved | Sarah Smith — Support Worker
Incident category | Classification of the incident type for trending purposes | Medication error
Severity rating | Consistent severity scale applied across all incidents | Minor / Moderate / Major / Critical
Brief description | One- to two-sentence summary of what occurred | Participant missed 8am prescribed medication due to shift changeover miscommunication
Immediate actions taken | What was done at the time of the incident | Medication administered at 9:15am. GP contacted. No adverse effects observed.
Reportable incident? | Yes/No flag indicating whether the incident met reportable thresholds | No
NDIS Commission notification date | If reportable, the date the notification was submitted | N/A
Follow-up actions required | Corrective or preventive actions identified | Review shift handover procedure. Retrain staff on medication administration at changeover.
Person responsible for follow-up | Named individual accountable for completing follow-up | House Coordinator — Michael Chen
Follow-up completion date | Date the follow-up actions were completed | 12/04/2026
Status | Current status of the incident (Open / In Progress / Closed) | Closed
Reviewed by | Manager or key personnel who reviewed the incident | Operations Manager — 08/04/2026

Incident Categories

Consistent categorisation is essential for meaningful trend analysis. Establish a fixed set of categories and use them consistently across all entries. Common categories for NDIS providers include:

Severity Rating Scale

Apply a consistent severity scale across your register. A four-level scale is standard:

Rating | Definition | Example
1 — Minor | No harm occurred. Minor impact on service delivery. Resolved with routine response. | Participant missed one meal due to shopping delivery delay.
2 — Moderate | Minor harm occurred or could have occurred. Required non-routine response. | Participant sustained a minor bruise from a fall. First aid administered.
3 — Major | Significant harm or potential for significant harm. Required external intervention. | Participant required hospital treatment after medication error.
4 — Critical | Serious injury, death, abuse, or use of unauthorised restrictive practice. Likely reportable to NDIS Commission. | Participant sustained a fracture requiring surgical intervention.

4. Linking to Reportable Incidents

One of the most important functions of your incident register is flagging incidents that meet the threshold for reporting to the NDIS Quality and Safeguards Commission. Under the NDIS (Incident Management and Reportable Incidents) Rules 2018, the following categories of incidents are reportable:

Your incident register must include fields that clearly identify whether each incident is reportable, and if so, document the notification details. At minimum, you should capture:

Important

The 24-hour timeframe for Priority 1 reportable incidents runs from when any person within your organisation becomes aware of the incident — not from when management is notified. Ensure your staff understand their obligation to report incidents internally immediately so that your organisation can meet the NDIS Commission notification timeframe. Document these internal reporting timeframes in your register.
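The awareness-based timeframe above is easy to get wrong in practice. As a minimal sketch, assuming a 24-hour Priority 1 window measured from the moment any worker becomes aware (per the paragraph above), the deadline calculation looks like this; the function names are illustrative, not from any official tool:

```python
from datetime import datetime, timedelta

# Assumption for illustration: Priority 1 reportable incidents must reach the
# NDIS Commission within 24 hours of ANY worker becoming aware of the incident.
PRIORITY_1_WINDOW = timedelta(hours=24)

def commission_deadline(first_awareness: datetime) -> datetime:
    """Deadline runs from first internal awareness, not from when management is told."""
    return first_awareness + PRIORITY_1_WINDOW

def is_notification_on_time(first_awareness: datetime, notified: datetime) -> bool:
    """True if the Commission notification was lodged within the window."""
    return notified <= commission_deadline(first_awareness)

# A support worker becomes aware at 14:30 on 6 April; management hears later.
aware = datetime(2026, 4, 6, 14, 30)
print(commission_deadline(aware))  # 2026-04-07 14:30:00
```

The key design point mirrors the rule in the text: the clock starts at `first_awareness`, so recording when staff first knew of an incident matters as much as recording when it was escalated.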

Linking the Register to Your Incident Management Policy

Your incident register should be explicitly referenced in your incident management policy. The policy should describe how the register is used, who is responsible for maintaining it, who has access, and how it connects to your continuous improvement and risk management systems. During audit, assessors will cross-reference your policy with your register to ensure alignment.

The register should also link outward to other compliance documents:

5. Electronic vs Paper Registers

The NDIS Commission does not mandate a specific format for your incident register. Both electronic and paper-based registers are acceptable, provided they meet the core requirements of completeness, accuracy, security, and accessibility. However, each format has distinct advantages and limitations that small providers should consider carefully.

Paper Registers

Advantages:

Limitations:

If you use a paper register, ensure entries are made in ink (never pencil), pages are sequentially numbered, and the register is stored in a locked location with controlled access. Do not use loose-leaf binders — use a bound book or permanently bound printed template to prevent page removal.

Electronic Registers

Advantages:

Limitations:

For most small NDIS providers, a spreadsheet-based register (Microsoft Excel or Google Sheets) provides the best balance of functionality and simplicity. Use data validation for dropdown fields (incident category, severity, status, reportable flag) and conditional formatting to highlight overdue follow-up actions or open incidents.
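The dropdown validation described above can also be applied when register data is exported or migrated. Here is a minimal sketch of the same fixed-list checks in Python; the category names (beyond "Medication error", which appears in this article's examples) and the column headings are assumptions matching the field table in Section 3:

```python
import csv
import io

# Hypothetical fixed lists mirroring the spreadsheet dropdowns described above.
CATEGORIES = {"Medication error", "Fall", "Behavioural incident", "Property damage"}
SEVERITIES = {"Minor", "Moderate", "Major", "Critical"}
STATUSES = {"Open", "In Progress", "Closed"}
REPORTABLE = {"Yes", "No"}

def validate_row(row: dict) -> list:
    """Return a list of validation problems for one register entry (empty = valid)."""
    problems = []
    if row["Incident category"] not in CATEGORIES:
        problems.append(f"unknown category: {row['Incident category']!r}")
    if row["Severity rating"] not in SEVERITIES:
        problems.append(f"unknown severity: {row['Severity rating']!r}")
    if row["Status"] not in STATUSES:
        problems.append(f"unknown status: {row['Status']!r}")
    if row["Reportable incident?"] not in REPORTABLE:
        problems.append("reportable flag must be Yes or No")
    return problems

sample = io.StringIO(
    "Incident category,Severity rating,Status,Reportable incident?\n"
    "Medication error,Minor,Closed,No\n"
    "Trip,Minor,Open,No\n"  # "Trip" is not in the fixed list, so it gets flagged
)
for i, row in enumerate(csv.DictReader(sample), start=1):
    for problem in validate_row(row):
        print(f"row {i}: {problem}")
```

Note how "Trip" is rejected: this is exactly the inconsistent-categorisation problem discussed under common audit findings later in this article, caught at entry time rather than at audit.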

Need an Audit-Ready Incident Register Template?

The SIL Rescue Kit includes a pre-built incident register with all mandatory fields, dropdown categories, severity ratings, and reportable incident tracking — ready to customise with your organisation details.

Get the SIL Rescue Kit — $297

6. Register Review Schedule

Maintaining an incident register is not a "set and forget" task. The NDIS Practice Standards require that providers review incident data regularly and use it to drive improvements. Auditors will look for documented evidence that reviews occur at defined intervals.

Recommended Review Frequency

Review Type | Frequency | Purpose | Documented By
Individual incident review | Within 48 hours of each incident | Verify accuracy of entry, assess severity, determine if reportable, assign follow-up actions | Manager or designated incident reviewer
Weekly check | Weekly | Ensure all incidents from the past week are recorded, check for overdue follow-up actions | Team leader or coordinator
Monthly summary | Monthly | Count incidents by category and severity, identify emerging patterns, compare to previous months | Operations manager or quality lead
Quarterly analysis | Every 3 months | Comprehensive trend analysis, comparison across locations and categories, identify systemic issues, update risk register | Key personnel or governance body
Annual review | Annually | Full-year analysis, comparison to previous years, inform strategic planning, review and update incident management policy | Key personnel and governance body

Documenting Reviews

Each review should be documented with:

These review records can be appended to your register as a separate review log tab (if electronic) or maintained in a separate review register that cross-references the incident register.

7. Trend Analysis and Quality Improvement

The true value of your incident register extends far beyond compliance documentation. When used effectively, the register becomes a powerful quality improvement tool that helps you prevent future incidents and improve participant outcomes.

What to Analyse

Effective incident trend analysis examines multiple dimensions of your data:

Presenting Trend Data

For quarterly and annual reviews, present your analysis in a format that is accessible to all stakeholders. Simple approaches include:

You do not need sophisticated software for this. Microsoft Excel or Google Sheets can generate all of these visualisations from your register data. The key is consistency — use the same format each review period so that trends are visible over time.

Closing the Loop: From Analysis to Action

Trend analysis is only valuable if it leads to action. For every significant trend you identify, follow this cycle:

  1. Identify the trend (e.g., medication errors at shift changeover have increased by 40% this quarter).
  2. Investigate root causes (e.g., handover procedure does not include medication check, agency staff are unfamiliar with participant medication schedules).
  3. Develop corrective actions (e.g., add mandatory medication handover checklist, require agency staff to complete medication orientation before independent shifts).
  4. Implement and document (update procedures, deliver training, record in continuous improvement register).
  5. Monitor effectiveness (track the same incident category in subsequent quarters to verify the corrective action is reducing occurrences).

This cycle is exactly what auditors look for — evidence that your organisation does not just record incidents but actively uses incident data to improve safety and service quality. Document each step so the audit trail is clear.
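The trend-identification step in the cycle above can be sketched with nothing more than category counts per quarter. The figures below are illustrative only; in practice they come from your register's category column, grouped by review period:

```python
from collections import Counter

# Illustrative quarterly counts (the 5 -> 7 medication-error rise is a 40%
# increase, matching the worked example in the cycle above).
q1 = Counter({"Medication error": 5, "Fall": 3})
q2 = Counter({"Medication error": 7, "Fall": 3, "Behavioural incident": 2})

def quarter_change(prev: Counter, curr: Counter) -> dict:
    """Percentage change per category; None means the category is new this quarter."""
    changes = {}
    for category in prev | curr:  # union of categories across both quarters
        before, after = prev[category], curr[category]
        changes[category] = None if before == 0 else round(100 * (after - before) / before)
    return changes

print(quarter_change(q1, q2))
```

Consistent categorisation is what makes this arithmetic meaningful: if the same incident type is logged under several labels, every per-category trend is understated.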

8. Common Audit Findings Related to Incident Registers

Understanding what auditors commonly identify as non-conformances helps you avoid the same pitfalls. Based on publicly available NDIS Commission audit outcome data and practitioner experience, the most common incident register findings include:

Finding 1: Incomplete Register Entries

The most frequent issue is missing fields. Auditors find entries where the severity rating is blank, follow-up actions are not recorded, the status is permanently "Open," or the date reported differs significantly from the date of the incident without explanation. Every field in your register should be completed for every entry — no exceptions.

Finding 2: No Evidence of Register Review

Providers maintain the register but cannot demonstrate that anyone reviews it. There are no review dates, no reviewer signatures, no trend analysis documents, and no connection between incident data and improvement actions. Without documented reviews, the register is just a log — not an incident management system.

Finding 3: Inconsistent Categorisation

Staff use different terms for the same type of incident (e.g., "fall," "trip," "slip," "loss of balance" all refer to falls but appear as separate categories). This makes trend analysis meaningless. Use a fixed dropdown list of categories and train staff to categorise consistently.

Finding 4: No Linkage to Reportable Incidents

The register does not clearly indicate which incidents are reportable, or there are incidents that meet reportable thresholds but are not flagged. Auditors will cross-reference your register against the reportable incident criteria and expect to find appropriate flagging. If an incident involving serious injury is recorded but not flagged as reportable, this is a significant non-conformance.

Finding 5: No Connection to Continuous Improvement

The register exists in isolation. There is no evidence that incident trends feed into the continuous improvement register, risk register, or governance reporting. The NDIS Practice Standards explicitly require that providers learn from incidents. A register that does not connect to your improvement systems fails to meet this requirement.

Finding 6: Register Does Not Align with Policy

Your incident management policy describes one process, but the register reflects a different one. For example, the policy states that the Operations Manager reviews all incidents within 24 hours, but the register shows no reviewer column or reviews happening weeks after incidents. Ensure your policy and register are aligned.

Finding 7: Suspiciously Low Incident Numbers

As mentioned earlier, a near-empty register raises red flags. For a SIL provider supporting multiple participants 24/7, auditors expect to see a reasonable volume of incidents including near misses, minor events, and medication-related entries. Zero or very few incidents over a 12-month period suggests under-reporting, not good practice.

Auditor Tip

When preparing for audit, review your register from the auditor's perspective. Select five random incidents from your register and pull the corresponding incident reports. Can you locate each report easily? Do the details match? Are follow-up actions completed and documented? Is there evidence of management review? This self-audit exercise will reveal gaps before the auditor finds them.
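The random selection in the self-audit exercise above can be done with a spreadsheet function or a few lines of code. A sketch, using hypothetical reference numbers in the INC-[YEAR]-[NUMBER] format described later in this article:

```python
import random

# Hypothetical register references; in practice, read these from the
# reference-number column of your register.
references = [f"INC-2026-{n:03d}" for n in range(1, 43)]

sample = random.sample(references, k=5)  # five distinct references, no repeats
print(sample)
```

For each sampled reference, pull the matching detailed incident report and run through the cross-checks listed above: locatability, matching details, completed follow-up, and evidence of management review.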

9. Step-by-Step: Building Your Incident Register

If you are building an incident register from scratch, follow these steps to create a compliant and functional register:

Step 1: Choose Your Format

For most small providers, a Microsoft Excel or Google Sheets spreadsheet is the ideal starting point. It provides the searchability and analysis capability of an electronic register without the cost of dedicated software. If you prefer paper, use a bound register book with pre-printed column headers.

Step 2: Set Up Your Fields

Create columns for every mandatory field listed in Section 3 above. Use the first row as a header row with clear, concise column names. Freeze the header row so it remains visible as you scroll through entries.

Step 3: Configure Data Validation

For electronic registers, set up data validation rules for consistency:

Step 4: Create Your Reference Number System

Establish a consistent numbering system. A recommended format is: INC-[YEAR]-[SEQUENTIAL NUMBER] (e.g., INC-2026-001). This provides a unique identifier for each incident that can be cross-referenced with the detailed incident report. If you have multiple locations, you may add a location prefix (e.g., INC-BEN-2026-001 for Bendigo).
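A sketch of the INC-[YEAR]-[SEQUENTIAL NUMBER] scheme described above, including the optional location prefix drawn from the Bendigo example; the function name and list-based lookup are illustrative assumptions, not part of any prescribed system:

```python
def next_reference(existing: list, year: int, location: str = "") -> str:
    """Generate the next sequential incident reference for a given year.

    existing: reference numbers already in the register.
    location: optional prefix, e.g. "BEN" for a Bendigo house.
    """
    prefix = f"INC-{location + '-' if location else ''}{year}-"
    # Collect the sequential part of every reference that matches this prefix.
    numbers = [int(ref.rsplit("-", 1)[1]) for ref in existing if ref.startswith(prefix)]
    return f"{prefix}{max(numbers, default=0) + 1:03d}"

print(next_reference([], 2026))                                # INC-2026-001
print(next_reference(["INC-2026-001", "INC-2026-002"], 2026))  # INC-2026-003
print(next_reference(["INC-BEN-2026-007"], 2026, "BEN"))       # INC-BEN-2026-008
```

Whatever tool you use, the important property is the one the text names: each reference is unique and maps one-to-one onto a detailed incident report.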

Step 5: Add Conditional Formatting

For electronic registers, add conditional formatting to highlight:

Step 6: Create a Review Log Tab

Add a separate worksheet tab for documenting register reviews. This tab should capture the date, reviewer name, review type (weekly/monthly/quarterly/annual), summary of findings, trends identified, and actions arising.

Step 7: Set Access Controls

Determine who needs access to the register and at what level. Typically:

Step 8: Train Your Staff

A register is only as good as the people using it. Train all staff on:

Record this training in your training register — auditors may check that staff have been trained in incident reporting procedures.

Step 9: Test with Sample Data

Before going live, enter five to ten sample incidents covering different categories and severities. Review the sample data to ensure all fields work correctly, categories are appropriate, and the register is easy to use. Adjust as needed before deploying to staff.

10. Complete Register Template Fields Reference

The following is a comprehensive reference of all fields your incident register should include, organised by section. Use this as a checklist when building or reviewing your register.

Incident Identification

Incident Details

Response and Actions

Reportable Incident Fields

Follow-Up and Closure

Not every incident will require every field to be populated (for example, a non-reportable incident will not need Commission notification fields). However, the fields should exist in your register so they are available when needed, and non-applicable fields should be marked "N/A" rather than left blank.

Get All Your Registers Sorted in One Package

The SIL Rescue Kit includes 10 pre-built registers including the incident register, complaints register, risk register, training register, and worker screening register — all mapped to NDIS Practice Standards and ready for your certification audit.

Get the SIL Rescue Kit — $297

Summary

Your NDIS incident register is far more than an administrative obligation. It is the central nervous system of your incident management framework — connecting individual incidents to systemic analysis, reportable incident obligations, continuous improvement actions, and risk management. A well-maintained register demonstrates to auditors that your organisation takes participant safety seriously and has the systems in place to learn from every event.

The key principles to remember are: record every incident consistently and completely, review the register at defined intervals, analyse trends quarterly, link findings to your continuous improvement and risk registers, and ensure the register aligns with your incident management policy. If you do these things, your incident register will serve you well at audit and — more importantly — help you deliver safer, better services to your participants.

For support workers needing to write compliant shift notes after incidents, our free NDIS Notes Rewriter tool can help ensure your documentation meets NDIS standards. And if you are preparing for your SIL certification audit, the SIL Rescue Kit provides all 10 registers, 25 policies, and 25 forms you need — ready to customise and deploy.

Important: This article provides general guidance about NDIS compliance requirements. It is not legal or professional advice. Requirements may change as the NDIS Commission updates its policies and Practice Standards. Always verify current requirements with the NDIS Quality and Safeguards Commission or a registered NDIS consultant before making compliance decisions.