This document outlines the identification, communication and management of outlying organisations (both stroke teams and CCGs/LHBs) in the SSNAP 30-day casemix adjusted mortality funnel based on annual financial year data. This policy is based on the DH/HQIP guidance "Detection and Management of Outliers 2011, updated January 2020".
- Data Quality:
a) Data Accuracy:
SSNAP employs a proactive approach to the management of data quality so that all reporting can be undertaken with a very fast turnaround. To achieve this, the lead clinician at every stroke team must have clinical oversight and sign off the data as correct before each data deadline. This clinical sign-off ensures that the data have been checked and approved for analysis ahead of time, and is enacted by a process of "locking" the record; only locked records are analysed. This eliminates the need for any discussion or debate with teams after the deadline about whether records need amending. No amendments are possible after the data deadline, and this process is now well established and understood by teams. In addition, SSNAP publicly reports the breakdown of each team’s casemix, and comparisons to the national casemix, every 3-4 months. Teams and clinicians are encouraged to review this information using a specifically designed "casemix tool" in order to identify any issues with casemix data quality early, and to correct the information before the annual data deadline. Furthermore, the casemix models used in SSNAP mortality reporting are peer-reviewed models published in Stroke, which has been well publicised to all teams, so teams are aware of the need to report these data items accurately.
b) Case Ascertainment:
Case ascertainment is determined by comparison with estimates from previous HES/PEDW figures, updated by teams and supported by documentation of clinical coding where necessary. Teams are penalised for low case ascertainment in their regular reporting; this has led to very high case ascertainment. For example, in 2015-16, 92% of routinely admitting teams in England and Wales submitted over 90% of their stroke cases to SSNAP, and all of the remaining routinely admitting teams submitted 80-89%. The case ascertainment for each team is always reported alongside mortality results.
c) Data Completeness:
For the outcome variable, death within 30 days of hospital arrival (or onset, if onset was in hospital), linkage to the Office for National Statistics (ONS) register of deaths is used to ensure completeness, in addition to in-hospital deaths reported on SSNAP. The only non-mandatory casemix variable used is the full National Institutes of Health Stroke Scale (NIHSS) on arrival; all other variables are fully complete for all patients. The SSNAP mortality report uses two models: one for patients with a fully completed NIHSS, and another for patients where the NIHSS is incomplete, in which case the model uses the level of consciousness (a mandatory variable) instead. The NIHSS was fully completed for 86.7% of patients nationally, and a range of initiatives are in place to encourage teams to complete it for even more patients. The percentage of patients at each team with a fully completed NIHSS score is also reported alongside the mortality results.
- Casemix Adjustment:
The SSNAP mortality models adjust for important casemix variables: patient age, stroke type, diagnosis of Atrial Fibrillation (AF) prior to stroke, and stroke severity (either NIHSS on arrival or, if incomplete, level of consciousness on arrival). The variables in the model were determined by backwards elimination of factors, and validated both internally and externally using a population-based study (the South London Stroke Register), which demonstrated that the models are very reliable. The methodology for the derivation of casemix variables was published in the peer-reviewed journal Stroke (Bray BD, Campbell J, Cloud GC, Hoffman A, James M, Tyrrell PJ, Wolfe CD, Rudd AG. Derivation and External Validation of a Case Mix Model for the Standardized Reporting of 30-Day Stroke Mortality Rates. Stroke. 2014; 45: 3374-3380). The models are recalibrated each year, and the ROC statistic, predictiveness curve and coefficients are reported in the technical appendix for each model. The possibility of "residual confounding" from variables not included in the model is raised in the letters sent to alert and alarm teams.
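For illustration only, the casemix adjustment described above can be sketched as a logistic model whose predicted probabilities are summed to give a team's expected deaths. The coefficients below are placeholders invented for this sketch; the real values are recalibrated annually and published in the SSNAP technical appendix.

```python
import math

# Placeholder coefficients for illustration only; the published SSNAP
# models are recalibrated each year and reported in the technical appendix.
COEF = {"intercept": -4.0, "age": 0.05, "haemorrhagic": 0.8,
        "af": 0.4, "nihss": 0.1}

def predicted_mortality(age, haemorrhagic, af, nihss):
    """Predicted probability of death within 30 days for one patient,
    from a logistic model over the casemix variables (age, stroke type,
    prior AF, stroke severity)."""
    lp = (COEF["intercept"]
          + COEF["age"] * age
          + COEF["haemorrhagic"] * haemorrhagic
          + COEF["af"] * af
          + COEF["nihss"] * nihss)
    return 1 / (1 + math.exp(-lp))

def expected_deaths(patients):
    """Casemix-adjusted expected deaths for a team: the sum of each
    patient's predicted probability of death."""
    return sum(predicted_mortality(**p) for p in patients)
```

In practice a second model of this form, substituting level of consciousness for the NIHSS term, is used for patients without a fully completed NIHSS.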
- Detection of Outliers:
Alarms are defined as teams or CCGs/LHBs outside the 99.8% control limit. Alerts are defined as teams or CCGs/LHBs outside the 99% control limit. Byar’s approximation is used to calculate the control limits to determine whether teams or CCGs/LHBs fall outside the alarm or alert levels.
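As a minimal sketch of the detection step, assuming the standard public-health funnel-plot formulation of Byar's approximation to the Poisson distribution (the exact SSNAP implementation may differ), control limits can be computed around the expected deaths and a team's standardised mortality ratio compared against them:

```python
import math

def byar_limits(expected, z):
    """Approximate lower/upper control limits for a standardised mortality
    ratio at a given number of expected deaths, using Byar's approximation
    to the Poisson distribution."""
    lower = (1 - 1 / (9 * expected) - z / (3 * math.sqrt(expected))) ** 3
    upper = ((expected + 1) / expected) * (
        1 - 1 / (9 * (expected + 1)) + z / (3 * math.sqrt(expected + 1))
    ) ** 3
    return lower, upper

# Two-sided normal z-values for the 99% (alert) and 99.8% (alarm) limits
Z_ALERT, Z_ALARM = 2.576, 3.090

def classify(observed, expected):
    """Flag a team or CCG/LHB as 'alarm', 'alert', or 'in control'."""
    smr = observed / expected
    lo_alarm, hi_alarm = byar_limits(expected, Z_ALARM)
    lo_alert, hi_alert = byar_limits(expected, Z_ALERT)
    if smr < lo_alarm or smr > hi_alarm:
        return "alarm"
    if smr < lo_alert or smr > hi_alert:
        return "alert"
    return "in control"
```

Plotting the limits from `byar_limits` across a range of expected deaths produces the funnel shape described in the next section, with the limits narrowing as the number of cases grows.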
- Presentation of Outliers:
Two different funnel plots are used for comparison between different teams, and between CCGs/LHBs, as they clearly demonstrate how the limits of acceptable performance depend on the number of cases. In addition, public tables of mortality present the data in tabular form, alongside the information on case ascertainment and data completeness.
- Policy for Management of Outliers:
The provisional timescales outlined below may be changed because of issues arising, for example when public holidays fall, because of the analytical processes, if no outliers are identified in one of the alert or alarm categories (e.g. no CCG-level outliers), or if one stage takes longer (in which case all subsequent timings are adjusted). However, results for trusts will never be put into the public domain any earlier than 8 weeks after alarm teams are contacted, in order for teams to discuss the findings with SSNAP, to be supported to understand the data, to conduct mortality reviews, and to investigate potential causes.
- Lead contact for stroke at each hospital approves the data for analysis by locking records by the data deadline. After this deadline, no further changes to the data are possible (see "data accuracy" above, for further explanation of the pre-deadline process)
- Financial year data downloaded from the webtool
- Patient data file sent securely to NHS Digital for linkage with ONS mortality (at least 4 months after data collection to allow time for deaths to be recorded)
- Mortality data analysed using codes for teams, so that the data analyst is unable to identify the names of teams. Identify the number of team outliers (alarms) and their positions on the funnel plot (without knowing which teams they are).
- Identify the number of alert (borderline) teams and their position on the funnel plot.
- Identify the number of CCG/LHB alarm outliers and their position on the CCG/LHB funnel plot. Identify any CCGs/LHBs who are not outliers themselves but whose patients primarily go to alarm teams.
- Identify the names of the alert and alarm teams, and the CCG/LHB alarms and prepare casemix analysis for each in order to aid discussion with the teams or CCGs/LHBs regarding potential reasons for their outlier status
- Send letters by email to each alarm team’s trust chief executive, medical director and lead clinician for stroke. Letters are sent by the Clinical Director of the Stroke Programme to give the trust an opportunity to investigate the findings and to receive support from the Stroke Programme, and to inform the trust that the CQC will be notified. The letter states that trusts have 10 working days to respond (copying in the CQC), and a further 10 working days to finalise a plan before other parties are notified. The letters offer the stroke team help to put together a plan for service improvement if it does appear that there are issues to be addressed, which may include taking advantage of the peer review visits offered by the Stroke Programme. The CEO is advised to inform commissioners, NHS Improvement and the relevant royal colleges.
- Send letters by email to each alert team’s trust chief executive, medical director and lead clinician for stroke. The letter states that trusts do not have to respond formally.
- Arrange phone calls with alarm and alert teams as required. Chase alarm teams if no response, copying in CQC. If no response is received within 5 working days, the CQC and NHS Improvement are notified of non-compliance.
- Individual team mortality files are uploaded to the webtool for each team. The Stroke Programme informs users by sending a standard email
- CCG/LHB alarm letters and letters informing CCGs/LHBs whose teams are outliers are sent. No formal response required
- Individual CCG/LHB mortality files are uploaded to the webtool for each CCG/LHB. The Stroke Programme informs users by sending a standard email
- The all teams public table of mortality is uploaded for logged in users, and users are informed by a standard email.
- The all CCG/LHB public table of mortality is uploaded for logged in users and users are informed by a standard email.
- Both public tables are released to the public on the SSNAP website, and can then be published in the CCG OIS.
- NHSE and the CQC will be notified of which teams are outliers, the content of any response, and whether or not a response has been provided.
7. Who to contact for further information
If you have any further queries about the mortality outlier process, or about SSNAP in general,
please feel free to contact the SSNAP helpdesk at firstname.lastname@example.org
8. National results information
Mortality results for 2016-17 are available here:
SSNAP Outlier Policy 2017 - External Version 1.2