1. What Is an NDIS Incident Register?
An NDIS incident register is a centralised log that captures summary information about every incident that occurs in the course of your disability service delivery. Unlike a detailed incident report — which documents the full narrative of a single event — the register provides a bird's-eye view of all incidents across your organisation in one location.
Think of the relationship between individual incident reports and the incident register as the relationship between individual bank transactions and a bank statement. The incident report captures the detail; the register captures the pattern.
The register serves three critical purposes within your compliance framework:
- Compliance evidence: It demonstrates to auditors that you have a systematic approach to recording, tracking, and managing incidents as required under the NDIS Practice Standards Core Module, particularly Outcome 2.4 (Incident Management).
- Quality improvement: By capturing incidents in a structured format, you can analyse trends over time, identify recurring issues, and implement preventive measures through your continuous improvement system.
- Accountability and governance: The register creates a verifiable record that your organisation responds to incidents appropriately, within required timeframes, and with documented follow-up actions.
Every registered NDIS provider — regardless of size — must maintain an incident register. Whether you support two participants or two hundred, the NDIS Commission expects to see a functioning register that aligns with your incident management policy and procedures.
2. Why Your Incident Register Matters for Compliance
The NDIS Practice Standards Core Module establishes the requirements for incident management under Outcome 2.4. This outcome requires that providers have systems in place for preventing, identifying, recording, responding to, reporting, and learning from incidents. Your incident register sits at the centre of this system.
During a certification or verification audit, auditors will typically request your incident register within the first hour of the document review. They use it as a starting point to understand the volume, nature, and management of incidents across your service. From the register, they will select specific incidents to examine in detail — pulling the corresponding incident reports, checking timeframes, verifying follow-up actions, and cross-referencing with participant files.
Practice Standards Requirements
The specific requirements your register must address include:
- Recording: Each incident must be recorded with sufficient detail to understand the nature, severity, and persons involved (NDIS Practice Standards, Core Module 2.4.1).
- Response tracking: The register must demonstrate that each incident received a proportionate response, including immediate actions and longer-term corrective measures (Core Module 2.4.2).
- Reporting linkage: Where an incident meets the threshold for a reportable incident, the register must clearly flag this and record that the notification was made to the NDIS Commission (Core Module 2.4.3).
- Review and learning: The register must be reviewed regularly, with documented evidence that incident data is analysed for trends and that findings inform service improvements (Core Module 2.4.5).
For SIL providers specifically, the register takes on additional significance. Supported Independent Living environments present inherent risks — medication management, participant interactions, overnight supervision gaps, and household safety. Auditors reviewing SIL providers expect to see a register that reflects the genuine operational reality of a 24/7 support environment, not a blank or near-blank register that suggests incidents are not being captured.
A common misconception among new providers is that a "clean" register with few or no incidents looks good at audit. The opposite is true. Auditors are suspicious of empty registers because they suggest incidents are occurring but not being recorded. A well-populated register with documented responses and follow-up demonstrates a mature safety culture.
3. Mandatory Fields Every Incident Register Must Include
While the NDIS Commission does not prescribe a specific register template, auditors expect to see certain fields that enable systematic tracking and analysis. The following fields should be considered mandatory for any NDIS incident register:
| Field | Purpose | Example |
|---|---|---|
| Incident reference number | Unique identifier for cross-referencing with the detailed incident report | INC-2026-0042 |
| Date and time of incident | When the incident occurred (not when it was reported) | 06/04/2026, 14:30 |
| Date and time reported | When the incident was reported internally — enables timeframe compliance checks | 06/04/2026, 15:15 |
| Location | Where the incident occurred (SIL house, community, day program, etc.) | SIL House — 42 Elm St, Bendigo |
| Participant(s) involved | Name and NDIS number of affected participant(s) | Jane Doe — NDIS 431 234 567 |
| Staff involved | Name and role of staff present or involved | Sarah Smith — Support Worker |
| Incident category | Classification of the incident type for trending purposes | Medication error |
| Severity rating | Consistent severity scale applied across all incidents | Minor / Moderate / Major / Critical |
| Brief description | One to two sentence summary of what occurred | Participant missed 8am prescribed medication due to shift changeover miscommunication |
| Immediate actions taken | What was done at the time of the incident | Medication administered at 9:15am. GP contacted. No adverse effects observed. |
| Reportable incident? | Yes/No flag indicating whether the incident met reportable thresholds | No |
| NDIS Commission notification date | If reportable, the date the notification was submitted | N/A |
| Follow-up actions required | Corrective or preventive actions identified | Review shift handover procedure. Retrain staff on medication administration at changeover. |
| Person responsible for follow-up | Named individual accountable for completing follow-up | House Coordinator — Michael Chen |
| Follow-up completion date | Date the follow-up actions were completed | 12/04/2026 |
| Status | Current status of the incident (Open / In Progress / Closed) | Closed |
| Reviewed by | Manager or key personnel who reviewed the incident | Operations Manager — 08/04/2026 |
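The mandatory fields above can be expressed as a simple completeness check. The sketch below is illustrative only — the field names are hypothetical, not an official NDIS schema — but it shows the rule that every field must hold a value (with "N/A" for non-applicable fields, never a blank):

```python
# Minimal sketch of a register entry completeness check.
# Field names are illustrative, not an official NDIS schema.
REQUIRED_FIELDS = [
    "reference", "incident_datetime", "reported_datetime", "location",
    "participants", "staff", "category", "severity", "description",
    "immediate_actions", "reportable", "commission_notified_date",
    "follow_up_actions", "follow_up_owner", "follow_up_completed",
    "status", "reviewed_by",
]

def missing_fields(entry: dict) -> list:
    """Return the mandatory fields that are blank or absent.

    Non-applicable fields should hold "N/A" rather than be left empty,
    so an empty string still counts as missing.
    """
    return [f for f in REQUIRED_FIELDS
            if str(entry.get(f, "")).strip() == ""]

entry = {
    "reference": "INC-2026-0042",
    "incident_datetime": "06/04/2026 14:30",
    "reported_datetime": "06/04/2026 15:15",
    "location": "SIL House — 42 Elm St, Bendigo",
    "participants": "Jane Doe — NDIS 431 234 567",
    "staff": "Sarah Smith — Support Worker",
    "category": "Medication error",
    "severity": "Minor",
    "description": "Missed 8am medication due to shift changeover miscommunication",
    "immediate_actions": "Medication administered 9:15am. GP contacted.",
    "reportable": "No",
    "commission_notified_date": "N/A",
    "follow_up_actions": "Review shift handover procedure",
    "follow_up_owner": "House Coordinator — Michael Chen",
    "follow_up_completed": "12/04/2026",
    "status": "Closed",
    "reviewed_by": "Operations Manager — 08/04/2026",
}
print(missing_fields(entry))  # an empty list means the entry is complete
```

A weekly check (see the review schedule later in this article) can run this kind of rule over every row and list any entries that need completing.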
Incident Categories
Consistent categorisation is essential for meaningful trend analysis. Establish a fixed set of categories and use them consistently across all entries. Common categories for NDIS providers include:
- Medication error (missed dose, wrong dose, wrong medication, wrong participant)
- Injury — participant (fall, self-harm, accident)
- Injury — staff (manual handling, workplace injury)
- Behaviour of concern (aggression, self-injurious behaviour, property damage)
- Abuse, neglect, or exploitation (suspected or confirmed)
- Missing or absent participant
- Property damage or theft
- Service delivery failure (missed shift, transport failure, service disruption)
- Environmental hazard (fire, flood, structural issue, infection control breach)
- Use of restrictive practice (authorised or unauthorised)
- Privacy or confidentiality breach
- Vehicle incident
- Near miss (an event that could have resulted in harm but did not)
Severity Rating Scale
Apply a consistent severity scale across your register. A four-level scale is standard:
| Rating | Definition | Example |
|---|---|---|
| 1 — Minor | No harm occurred. Minor impact on service delivery. Resolved with routine response. | Participant missed one meal due to shopping delivery delay. |
| 2 — Moderate | Minor harm occurred or could have occurred. Required non-routine response. | Participant sustained a minor bruise from a fall. First aid administered. |
| 3 — Major | Significant harm or potential for significant harm. Required external intervention. | Participant required hospital treatment after medication error. |
| 4 — Critical | Serious injury, death, abuse, or use of unauthorised restrictive practice. Likely reportable to NDIS Commission. | Participant sustained a fracture requiring surgical intervention. |
4. Linking to Reportable Incidents
One of the most important functions of your incident register is flagging incidents that meet the threshold for reporting to the NDIS Quality and Safeguards Commission. Under the NDIS (Incident Management and Reportable Incidents) Rules 2018, the following categories of incidents are reportable:
- The death of a participant
- Serious injury of a participant
- Abuse or neglect of a participant
- Unlawful sexual or physical contact with, or assault of, a participant
- Sexual misconduct committed against, or in the presence of, a participant
- Use of a restrictive practice in relation to a participant that is not in accordance with an authorisation or where no authorisation exists
Your incident register must include fields that clearly identify whether each incident is reportable, and if so, document the notification details. At minimum, you should capture:
- Reportable incident flag: A clear Yes/No field.
- Priority classification: Priority 1 (must notify within 24 hours) or Priority 2 (must notify within 5 business days).
- NDIS Commission notification date: The date the initial notification was submitted via the NDIS Commission portal.
- Commission reference number: The reference number assigned by the Commission upon receipt of your notification.
- Final report due date: 28 calendar days from the date the provider became aware of the incident.
- Final report submitted: Date the final incident report was submitted to the Commission.
The 24-hour timeframe for Priority 1 reportable incidents runs from when any person within your organisation becomes aware of the incident — not from when management is notified. Ensure your staff understand their obligation to report incidents internally immediately so that your organisation can meet the NDIS Commission notification timeframe. Document these internal reporting timeframes in your register.
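The two notification timeframes described above can be computed mechanically from the moment of awareness. The sketch below assumes the 24-hour and 5-business-day windows stated in this article; it counts business days as Monday to Friday and does not account for public holidays, so treat it as an illustration rather than a compliance tool:

```python
# Sketch of the Priority 1 / Priority 2 notification deadlines.
# Assumes 24 hours and 5 business days as described in the article;
# public holidays are NOT handled.
from datetime import datetime, timedelta

def notification_deadline(aware: datetime, priority: int) -> datetime:
    """Deadline for notifying the NDIS Commission, counted from the
    moment anyone in the organisation became aware of the incident."""
    if priority == 1:
        return aware + timedelta(hours=24)
    deadline = aware
    added = 0
    while added < 5:
        deadline += timedelta(days=1)
        if deadline.weekday() < 5:  # Monday=0 .. Friday=4
            added += 1
    return deadline

aware = datetime(2026, 4, 6, 14, 30)    # Monday 6 April 2026, 2:30pm
print(notification_deadline(aware, 1))  # Tuesday 7 April 2026, 2:30pm
print(notification_deadline(aware, 2))  # Monday 13 April 2026, 2:30pm
```

Note that the Priority 1 clock runs in calendar hours, so an incident a worker becomes aware of on a Friday afternoon is still due for notification on Saturday.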
Linking the Register to Your Incident Management Policy
Your incident register should be explicitly referenced in your incident management policy. The policy should describe how the register is used, who is responsible for maintaining it, who has access, and how it connects to your continuous improvement and risk management systems. During audit, assessors will cross-reference your policy with your register to ensure alignment.
The register should also link outward to other compliance documents:
- Continuous improvement register: Systemic issues identified through incident trending should generate entries in your CI register with specific improvement actions.
- Risk register: Recurring incident categories may indicate risks that need to be added to or escalated within your risk register.
- Training register: Where incidents reveal competency gaps, corresponding training entries should appear in your training register.
- Participant support plans: Individual participant incidents may trigger reviews of support plans, risk assessments, or behaviour support plans.
5. Electronic vs Paper Registers
The NDIS Commission does not mandate a specific format for your incident register. Both electronic and paper-based registers are acceptable, provided they meet the core requirements of completeness, accuracy, security, and accessibility. However, each format has distinct advantages and limitations that small providers should consider carefully.
Paper Registers
Advantages:
- No technology costs or learning curve
- Immediately available in emergencies (no power or internet required)
- Familiar format for staff with limited digital skills
Limitations:
- Difficult to search, sort, or filter entries
- Trending and analysis must be done manually
- Risk of loss, damage, or unauthorised access
- Cannot be simultaneously accessed by multiple people
- Harder to demonstrate version control and audit trail
If you use a paper register, ensure entries are made in ink (never pencil), pages are sequentially numbered, and the register is stored in a locked location with controlled access. Do not use loose-leaf binders — use a bound book or permanently bound printed template to prevent page removal.
Electronic Registers
Advantages:
- Searchable and sortable by any field
- Automatic date and time stamping
- Built-in data validation (e.g., mandatory fields, dropdown categories)
- Easy to generate reports, charts, and trend analysis
- Can be backed up automatically
- Access controls can restrict who can view, add, or modify entries
- Multiple users can access simultaneously
Limitations:
- Requires staff training and technology access
- Dependent on power and internet (for cloud-based systems)
- Must ensure data security and privacy compliance
- Audit trail must be deliberately configured (track changes, edit history)
For most small NDIS providers, a spreadsheet-based register (Microsoft Excel or Google Sheets) provides the best balance of functionality and simplicity. Use data validation for dropdown fields (incident category, severity, status, reportable flag) and conditional formatting to highlight overdue follow-up actions or open incidents.
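The dropdown validation described above can also be checked programmatically, for example over a CSV export of the register. The sketch below uses only the Python standard library and illustrative column names; in Excel or Google Sheets the same rules would be configured as list-type data validation instead:

```python
# Sketch of the dropdown-field validation idea over a CSV export.
# Column names are illustrative; in a spreadsheet these rules would be
# configured as list-type data validation on each column.
import csv
import io

ALLOWED = {
    "severity": {"Minor", "Moderate", "Major", "Critical"},
    "status": {"Open", "In Progress", "Closed"},
    "reportable": {"Yes", "No"},
}

def invalid_rows(csv_text: str) -> list:
    """Return (row_number, field, value) for every value outside its allowed set."""
    problems = []
    # start=2 because row 1 is the header row
    for n, row in enumerate(csv.DictReader(io.StringIO(csv_text)), start=2):
        for field, allowed in ALLOWED.items():
            if row.get(field, "") not in allowed:
                problems.append((n, field, row.get(field, "")))
    return problems

sample = """reference,severity,status,reportable
INC-2026-001,Minor,Closed,No
INC-2026-002,Severe,Open,No
"""
print(invalid_rows(sample))  # [(3, 'severity', 'Severe')]
```

Catching a stray value like "Severe" matters because it would otherwise appear as its own category in severity counts and quietly distort your trend analysis.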
Need an Audit-Ready Incident Register Template?
The SIL Rescue Kit includes a pre-built incident register with all mandatory fields, dropdown categories, severity ratings, and reportable incident tracking — ready to customise with your organisation details.
Get the SIL Rescue Kit — $297

6. Register Review Schedule
Maintaining an incident register is not a "set and forget" task. The NDIS Practice Standards require that providers review incident data regularly and use it to drive improvements. Auditors will look for documented evidence that reviews occur at defined intervals.
Recommended Review Frequency
| Review Type | Frequency | Purpose | Documented By |
|---|---|---|---|
| Individual incident review | Within 48 hours of each incident | Verify accuracy of entry, assess severity, determine if reportable, assign follow-up actions | Manager or designated incident reviewer |
| Weekly check | Weekly | Ensure all incidents from the past week are recorded, check for overdue follow-up actions | Team leader or coordinator |
| Monthly summary | Monthly | Count incidents by category and severity, identify emerging patterns, compare to previous months | Operations manager or quality lead |
| Quarterly analysis | Every 3 months | Comprehensive trend analysis, comparison across locations and categories, identify systemic issues, update risk register | Key personnel or governance body |
| Annual review | Annually | Full-year analysis, comparison to previous years, inform strategic planning, review and update incident management policy | Key personnel and governance body |
Documenting Reviews
Each review should be documented with:
- Date of review
- Name and role of the reviewer
- Summary of findings (e.g., "14 incidents recorded in March 2026 — 6 medication-related, 4 behavioural, 2 falls, 2 service delivery")
- Trends identified (e.g., "Medication errors have increased 50% compared to the previous quarter, primarily occurring at shift changeover")
- Actions arising from the review (e.g., "Implement double-check medication handover procedure — training scheduled for 15 April 2026")
- Linkage to continuous improvement register (e.g., "CI register entry CI-2026-008 created for medication handover improvement project")
These review records can be appended to your register as a separate review log tab (if electronic) or maintained in a separate review register that cross-references the incident register.
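The "summary of findings" line in a review record can be generated directly from register data so it always matches the entries. A minimal sketch, assuming simplified register rows and a hypothetical helper name:

```python
# Sketch of generating a monthly findings summary from register rows.
# Row structure and helper name are illustrative.
from collections import Counter

def review_summary(rows: list, month: str) -> str:
    """Build a findings line like the example in the review checklist."""
    counts = Counter(r["category"] for r in rows if r["month"] == month)
    total = sum(counts.values())
    parts = ", ".join(f"{n} {c.lower()}" for c, n in counts.most_common())
    return f"{total} incidents recorded in {month}: {parts}"

rows = [
    {"month": "2026-03", "category": "Medication error"},
    {"month": "2026-03", "category": "Medication error"},
    {"month": "2026-03", "category": "Fall"},
    {"month": "2026-02", "category": "Fall"},
]
print(review_summary(rows, "2026-03"))
# 3 incidents recorded in 2026-03: 2 medication error, 1 fall
```

Generating the line rather than typing it removes one common audit finding: a review summary that does not match the register it claims to summarise.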
7. Trending and Analysis: Turning Data Into Improvement
The true value of your incident register extends far beyond compliance documentation. When used effectively, the register becomes a powerful quality improvement tool that helps you prevent future incidents and improve participant outcomes.
What to Analyse
Effective incident trend analysis examines multiple dimensions of your data:
- Volume trends: Is the total number of incidents increasing, decreasing, or stable over time? A sudden spike may indicate a new risk factor. A gradual decline (with no change in reporting culture) may indicate that improvement initiatives are working.
- Category distribution: Which incident categories are most common? Are certain categories increasing while others decrease?
- Severity distribution: What proportion of incidents are minor vs moderate vs major? A shift toward higher severity incidents warrants immediate attention.
- Location patterns: Are certain SIL houses or service locations experiencing more incidents than others? This may point to environmental factors, staffing issues, or participant-specific risks.
- Time patterns: Do incidents cluster around certain times of day (e.g., shift changeover, overnight, mealtimes), days of the week, or periods of the year?
- Staff patterns: Are certain staff members involved in a disproportionate number of incidents? This may indicate training needs, fatigue, or performance issues — not necessarily fault.
- Participant patterns: Are certain participants experiencing recurrent incidents? This should trigger a review of their support plan, risk assessment, and (where relevant) behaviour support plan.
- Response effectiveness: Are corrective actions reducing the recurrence of specific incident types? If the same type of incident keeps recurring despite corrective actions, the actions themselves may need to be reassessed.
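Most of the dimensions above reduce to counting register rows by one field at a time. A minimal sketch using `collections.Counter` over simplified, illustrative rows:

```python
# Sketch of category, severity, and per-month counts for trend analysis.
# Rows are simplified and illustrative.
from collections import Counter

incidents = [
    {"month": "2026-02", "category": "Medication error", "severity": "Minor"},
    {"month": "2026-03", "category": "Medication error", "severity": "Moderate"},
    {"month": "2026-03", "category": "Medication error", "severity": "Minor"},
    {"month": "2026-03", "category": "Fall", "severity": "Moderate"},
]

by_category = Counter(i["category"] for i in incidents)   # category distribution
by_severity = Counter(i["severity"] for i in incidents)   # severity distribution
march = Counter(i["category"] for i in incidents
                if i["month"] == "2026-03")               # volume for one month

print(by_category.most_common())  # [('Medication error', 3), ('Fall', 1)]
print(by_severity["Moderate"])    # 2
print(march["Medication error"])  # 2
```

The same one-line counting pattern works for location, time-of-day, staff, and participant dimensions — swap the field being counted. Note the caveat from the staff-patterns point above: a high count is a prompt for investigation, not a finding of fault.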
Presenting Trend Data
For quarterly and annual reviews, present your analysis in a format that is accessible to all stakeholders. Simple approaches include:
- Bar charts showing incident counts by category per month
- Pie charts showing severity distribution
- Line graphs showing incident volume trends over 12 months
- Heat maps showing incidents by time of day and day of week
- Tables comparing current quarter to previous quarter with percentage change
You do not need sophisticated software for this. Microsoft Excel or Google Sheets can generate all of these visualisations from your register data. The key is consistency — use the same format each review period so that trends are visible over time.
Closing the Loop: From Analysis to Action
Trend analysis is only valuable if it leads to action. For every significant trend you identify, follow this cycle:
1. Identify the trend (e.g., medication errors at shift changeover have increased by 40% this quarter).
2. Investigate root causes (e.g., handover procedure does not include medication check, agency staff are unfamiliar with participant medication schedules).
3. Develop corrective actions (e.g., add mandatory medication handover checklist, require agency staff to complete medication orientation before independent shifts).
4. Implement and document (update procedures, deliver training, record in continuous improvement register).
5. Monitor effectiveness (track the same incident category in subsequent quarters to verify the corrective action is reducing occurrences).
This cycle is exactly what auditors look for — evidence that your organisation does not just record incidents but actively uses incident data to improve safety and service quality. Document each step so the audit trail is clear.
8. Common Audit Findings Related to Incident Registers
Understanding what auditors commonly identify as non-conformances helps you avoid the same pitfalls. Based on publicly available NDIS Commission audit outcome data and practitioner experience, the most common incident register findings include:
Finding 1: Incomplete Register Entries
The most frequent issue is missing fields. Auditors find entries where the severity rating is blank, follow-up actions are not recorded, the status is permanently "Open," or the date reported differs significantly from the date of the incident without explanation. Every field in your register should be completed for every entry — no exceptions.
Finding 2: No Evidence of Register Review
Providers maintain the register but cannot demonstrate that anyone reviews it. There are no review dates, no reviewer signatures, no trend analysis documents, and no connection between incident data and improvement actions. Without documented reviews, the register is just a log — not an incident management system.
Finding 3: Inconsistent Categorisation
Staff use different terms for the same type of incident (e.g., "fall," "trip," "slip," "loss of balance" all refer to falls but appear as separate categories). This makes trend analysis meaningless. Use a fixed dropdown list of categories and train staff to categorise consistently.
Finding 4: No Linkage to Reportable Incidents
The register does not clearly indicate which incidents are reportable, or there are incidents that meet reportable thresholds but are not flagged. Auditors will cross-reference your register against the reportable incident criteria and expect to find appropriate flagging. If an incident involving serious injury is recorded but not flagged as reportable, this is a significant non-conformance.
Finding 5: No Connection to Continuous Improvement
The register exists in isolation. There is no evidence that incident trends feed into the continuous improvement register, risk register, or governance reporting. The NDIS Practice Standards explicitly require that providers learn from incidents. A register that does not connect to your improvement systems fails to meet this requirement.
Finding 6: Register Does Not Align with Policy
Your incident management policy describes one process, but the register reflects a different one. For example, the policy states that the Operations Manager reviews all incidents within 24 hours, but the register shows no reviewer column or reviews happening weeks after incidents. Ensure your policy and register are aligned.
Finding 7: Suspiciously Low Incident Numbers
As mentioned earlier, a near-empty register raises red flags. For a SIL provider supporting multiple participants 24/7, auditors expect to see a reasonable volume of incidents including near misses, minor events, and medication-related entries. Zero or very few incidents over a 12-month period suggests under-reporting, not good practice.
When preparing for audit, review your register from the auditor's perspective. Select five random incidents from your register and pull the corresponding incident reports. Can you locate each report easily? Do the details match? Are follow-up actions completed and documented? Is there evidence of management review? This self-audit exercise will reveal gaps before the auditor finds them.
9. Step-by-Step: Building Your Incident Register
If you are building an incident register from scratch, follow these steps to create a compliant and functional register:
Step 1: Choose Your Format
For most small providers, a Microsoft Excel or Google Sheets spreadsheet is the ideal starting point. It provides the searchability and analysis capability of an electronic register without the cost of dedicated software. If you prefer paper, use a bound register book with pre-printed column headers.
Step 2: Set Up Your Fields
Create columns for every mandatory field listed in Section 3 above. Use the first row as a header row with clear, concise column names. Freeze the header row so it remains visible as you scroll through entries.
Step 3: Configure Data Validation
For electronic registers, set up data validation rules for consistency:
- Incident category: Dropdown list with your fixed categories
- Severity: Dropdown list (Minor / Moderate / Major / Critical)
- Status: Dropdown list (Open / In Progress / Closed)
- Reportable incident: Dropdown list (Yes / No)
- Date fields: Date format validation (DD/MM/YYYY)
Step 4: Create Your Reference Number System
Establish a consistent numbering system. A recommended format is: INC-[YEAR]-[SEQUENTIAL NUMBER] (e.g., INC-2026-001). This provides a unique identifier for each incident that can be cross-referenced with the detailed incident report. If you have multiple locations, you may add a location prefix (e.g., INC-BEN-2026-001 for Bendigo).
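The INC-[YEAR]-[SEQUENTIAL NUMBER] scheme described above can be automated so numbers are never skipped or duplicated. A minimal sketch (the helper name is illustrative):

```python
# Sketch of generating the next INC-[YEAR]-[SEQUENCE] reference number.
# Helper name is illustrative.
def next_reference(existing: list, year: int) -> str:
    """Return the next sequential reference for the given year.

    Sequences restart each year; references from other years are ignored.
    """
    prefix = f"INC-{year}-"
    seqs = [int(ref.rsplit("-", 1)[1])
            for ref in existing if ref.startswith(prefix)]
    return f"{prefix}{max(seqs, default=0) + 1:03d}"

print(next_reference(["INC-2025-014", "INC-2026-001", "INC-2026-002"], 2026))
# INC-2026-003
print(next_reference([], 2026))  # INC-2026-001
```

In a spreadsheet register the equivalent is simply assigning the next number in the column, but a generated number avoids the duplicate references that break cross-referencing with incident reports.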
Step 5: Add Conditional Formatting
For electronic registers, add conditional formatting to highlight:
- Critical severity incidents in red
- Open incidents older than 7 days in yellow
- Reportable incidents in orange
- Overdue follow-up actions in red
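The four highlighting rules above can be written down as predicates, which is also exactly the logic you would configure as conditional formatting formulas. A sketch with illustrative field names:

```python
# Sketch of the conditional formatting rules as predicates.
# Field names are illustrative.
from datetime import date, timedelta

def needs_highlight(entry: dict, today: date) -> list:
    """Return the colour rules this register entry triggers."""
    flags = []
    if entry.get("severity") == "Critical":
        flags.append("red: critical severity")
    if (entry.get("status") == "Open"
            and today - entry["incident_date"] > timedelta(days=7)):
        flags.append("yellow: open > 7 days")
    if entry.get("reportable") == "Yes":
        flags.append("orange: reportable")
    due = entry.get("follow_up_due")
    if due and entry.get("status") != "Closed" and due < today:
        flags.append("red: follow-up overdue")
    return flags

entry = {
    "severity": "Moderate",
    "status": "Open",
    "incident_date": date(2026, 4, 1),
    "reportable": "No",
    "follow_up_due": date(2026, 4, 10),
}
print(needs_highlight(entry, date(2026, 4, 12)))
# ['yellow: open > 7 days', 'red: follow-up overdue']
```

An entry can trigger several rules at once, as in the example: it is both open past 7 days and overdue on follow-up, which is precisely the kind of entry a weekly check should surface.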
Step 6: Create a Review Log Tab
Add a separate worksheet tab for documenting register reviews. This tab should capture the date, reviewer name, review type (weekly/monthly/quarterly/annual), summary of findings, trends identified, and actions arising.
Step 7: Set Access Controls
Determine who needs access to the register and at what level. Typically:
- Add new entries: All direct support staff, team leaders, coordinators
- Edit existing entries: Team leaders, coordinators, managers only
- Review and close: Managers and key personnel only
- Delete entries: No one (entries should never be deleted — void them with an explanation if entered in error)
Step 8: Train Your Staff
A register is only as good as the people using it. Train all staff on:
- When to record an incident (threshold for entry)
- How to access and use the register
- Which category and severity to assign (with examples)
- The timeframe for recording incidents (same shift or within 24 hours)
- The difference between the register entry and the detailed incident report (both are required)
Record this training in your training register — auditors may check that staff have been trained in incident reporting procedures.
Step 9: Test with Sample Data
Before going live, enter five to ten sample incidents covering different categories and severities. Review the sample data to ensure all fields work correctly, categories are appropriate, and the register is easy to use. Adjust as needed before deploying to staff.
10. Complete Register Template Fields Reference
The following is a comprehensive reference of all fields your incident register should include, organised by section. Use this as a checklist when building or reviewing your register.
Incident Identification
- Incident reference number (unique sequential ID)
- Date and time of incident
- Date and time reported internally
- Person reporting the incident (name and role)
Incident Details
- Location of incident
- Participant(s) involved (name and NDIS number)
- Staff involved (name and role)
- Other persons involved (visitors, family, other participants)
- Incident category (from fixed list)
- Severity rating (Minor / Moderate / Major / Critical)
- Brief description of the incident (1-2 sentences)
Response and Actions
- Immediate actions taken
- External services contacted (ambulance, police, fire, GP)
- Participant outcome (no injury, first aid, medical treatment, hospitalisation)
- Family or guardian notified (date and method)
Reportable Incident Fields
- Is this a reportable incident? (Yes / No)
- Reportable incident category
- Priority classification (Priority 1 / Priority 2)
- NDIS Commission notification date
- Commission reference number
- Final report due date
- Final report submission date
Follow-Up and Closure
- Follow-up actions required (specific, measurable actions)
- Person responsible for follow-up
- Target completion date for follow-up
- Actual completion date
- Status (Open / In Progress / Closed)
- Reviewed by (name, role, date)
- Linked CI register entry (if applicable)
- Linked risk register entry (if applicable)
Not every incident will require every field to be populated (for example, a non-reportable incident will not need Commission notification fields). However, the fields should exist in your register so they are available when needed, and non-applicable fields should be marked "N/A" rather than left blank.
Get All Your Registers Sorted in One Package
The SIL Rescue Kit includes 10 pre-built registers including the incident register, complaints register, risk register, training register, and worker screening register — all mapped to NDIS Practice Standards and ready for your certification audit.
Get the SIL Rescue Kit — $297

Summary
Your NDIS incident register is far more than an administrative obligation. It is the central nervous system of your incident management framework — connecting individual incidents to systemic analysis, reportable incident obligations, continuous improvement actions, and risk management. A well-maintained register demonstrates to auditors that your organisation takes participant safety seriously and has the systems in place to learn from every event.
The key principles to remember are: record every incident consistently and completely, review the register at defined intervals, analyse trends quarterly, link findings to your continuous improvement and risk registers, and ensure the register aligns with your incident management policy. If you do these things, your incident register will serve you well at audit and — more importantly — help you deliver safer, better services to your participants.
For support workers needing to write compliant shift notes after incidents, our free NDIS Notes Rewriter tool can help ensure your documentation meets NDIS standards. And if you are preparing for your SIL certification audit, the SIL Rescue Kit provides all 10 registers, 25 policies, and 25 forms you need — ready to customise and deploy.
Important: This article provides general guidance about NDIS compliance requirements. It is not legal or professional advice. Requirements may change as the NDIS Commission updates its policies and Practice Standards. Always verify current requirements with the NDIS Quality and Safeguards Commission or a registered NDIS consultant before making compliance decisions.