Service Reports and Data Analysis
Rotamap provides twice-yearly benchmarks for our client hospital organisations. We review these at our autumn and spring events where we also discuss interesting questions and themes that have emerged from operational data.
Rotamap's department Service Reports are single-page documents containing a variety of statistical information about a department's activity, drawn from their CLWRota or Medirota instance. At our autumn 2018 event at the Stanley Building in central London, we discussed the ways these documents can be used to analyse department activity with a view to improving performance and reducing variation.
Figure 1. Part of a Rotamap department Service Report
Setup and published weeks
The setup and published weeks chart is used to visualise how far ahead a department is setting up (creating a draft week's rota) and publishing (finalising a week's rota). This shows how regularly a department is commencing planning, and how far ahead. The illustration below is a clear example of a department regularly working towards achieving the '6-4-2' model of theatre planning.
Figure 2. Department A setup and published weeks
The department above, Department A, can be seen to be setting up each week's rota consistently six weeks out. An interesting spike appears in the centre of the graph, where the Christmas holiday week was set up eleven to twelve weeks in advance. In addition, each week's rota is consistently published only a small number of days before that week takes place.
Figure 3. Department B setup and published weeks
The above chart shows a department, Department B, with less stable setup and publishing characteristics than Department A. The first interesting aspect of this graph is that for the first six months of data the department set up roughly a month's worth of rotas at a time, shown in the sawtooth-like ridges. After the new year this systematic approach to setting up the rota gives way to a more random pattern.
Department A's setup and publishing follows a more regular pattern, which suggests that the rota is well managed. We consider a hallmark of a well-planned department to be setting up weeks six to eight weeks in advance and publishing one to two weeks in advance. Earlier publishing gives more visibility of the rota, which allows useful corrections and readjustments and better coordination with cooperating departments.
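The lead times behind a setup and published weeks chart can be thought of as the gap between the date a week's rota was set up or published and the date that week begins. A minimal sketch, using hypothetical dates rather than real system data:

```python
from datetime import date

def lead_time_weeks(action_date: date, week_start: date) -> float:
    """Weeks between an action (setting up or publishing a rota)
    and the start of the rota week it applies to."""
    return (week_start - action_date).days / 7

# A rota week starting 6 October that was set up on 11 August
# and published on 29 September (hypothetical example):
setup_lead = lead_time_weeks(date(2018, 8, 11), date(2018, 10, 6))    # 8.0 weeks
publish_lead = lead_time_weeks(date(2018, 9, 29), date(2018, 10, 6))  # 1.0 week
```

Plotting these lead times for each week in turn produces the series shown in Figures 2 and 3; a department following the '6-4-2' model would show setup lead times clustering around six weeks.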
Heatmaps
Heatmaps show total session counts aggregated by day of week and session over a reporting period. For CLWRota, Service Reports include heatmaps for solo and extra usage; for Medirota, they show cancellations and extra usage.
Heatmaps help visualise how evenly a count is spread over the main sessions of a week. Even colouration shows little discrepancy, while "hotspots" indicate a high count for a particular day and session. Consequently, heatmaps can show where a department is stressed by a shortfall in resources or is struggling to meet demand.
Figure 4. Department extra usage heatmap
The heatmap above shows increased usage of extra (paid-for) sessions on Tuesdays and Fridays. Possible reasons for this pattern include a lack of job-planned sessions at these points in the week, a lack of flexible resources, uneven demand on the service (such as inconsistent theatre scheduling), or a combination of these. Heatmaps are useful for showing unevenness and prompt more in-depth analysis of the phenomenon, such as checking the system capacity reports.
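The aggregation behind such a heatmap is a simple count over (day of week, session) pairs. A sketch with hypothetical sample records, not actual report data:

```python
from collections import Counter

# Each record is one extra session worked: (day_of_week, session).
records = [
    ("Tue", "AM"), ("Tue", "PM"), ("Fri", "AM"),
    ("Fri", "AM"), ("Tue", "PM"), ("Wed", "AM"),
]

counts = Counter(records)

# Lay the counts out as a grid: one row per day, one column per session.
days = ["Mon", "Tue", "Wed", "Thu", "Fri"]
sessions = ["AM", "PM"]
heatmap = [[counts[(day, session)] for session in sessions] for day in days]
# heatmap[1] (Tuesday) is [1, 2]; heatmap[4] (Friday) is [2, 0]
```

Colouring each cell by its count then makes hotspots, such as the Tuesday and Friday clusters described above, immediately visible.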
P-charts
Service Reports also contain p-charts (also known as "statistical control charts" or "process control charts" – see "The p-control chart: a tool for care improvement" for an introduction to their use in healthcare). These provide a different view over time of a factor, such as extra session usage, and offer an alternative to heatmaps for examining variability. A particular quality of p-charts is that they derive control limits from the data itself, which helps illustrate Shewhart's theory of variation, which "...states that quality is inversely proportional to variability". In a system, lower variability is likely to lead to higher quality, as explained in the referenced article.
CLWRota Service Reports presently include p-charts for solo and extra usage, while Medirota reports cover cancellations and extra usage.
Figure 5. Department extra usage chart
The above p-chart shows the variation of extra sessions, by week, over the course of a year. The statistical control indicators on the chart show:
- Mean value
- Upper control limit
- Lower control limit
- Data points out of limits shown with red dots
- Runs of 7 data points above or below the mean shown with yellow dots
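The indicators listed above can be computed directly from the weekly data. The sketch below derives the centre line and 3-sigma control limits in the standard p-chart fashion (the limits follow from the binomial variation of a proportion), and flags out-of-limit points and runs of 7 on one side of the mean; the data shapes are assumptions for illustration:

```python
import math

def p_chart(counts, sample_sizes):
    """Weekly proportions with centre line and 3-sigma control limits.
    Limits are per sample, since sample sizes may vary week to week."""
    p_bar = sum(counts) / sum(sample_sizes)  # centre line (mean proportion)
    points = []
    for c, n in zip(counts, sample_sizes):
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        ucl = p_bar + 3 * sigma
        lcl = max(0.0, p_bar - 3 * sigma)  # a proportion cannot go below 0
        p = c / n
        points.append({
            "p": p,
            "ucl": ucl,
            "lcl": lcl,
            "out_of_limits": p > ucl or p < lcl,  # red dots
        })
    return p_bar, points

def runs_of_seven(points, p_bar):
    """Indices of points that complete a run of 7 or more consecutive
    points on one side of the mean (yellow dots)."""
    run, side, flagged = 0, 0, []
    for i, pt in enumerate(points):
        s = 1 if pt["p"] > p_bar else -1 if pt["p"] < p_bar else 0
        run = run + 1 if (s == side and s != 0) else (1 if s != 0 else 0)
        side = s
        if run >= 7:
            flagged.append(i)
    return flagged

# Hypothetical example: 5 weeks of extra sessions out of 20 sessions/week,
# with one week spiking to 18 extras.
p_bar, pts = p_chart([2, 2, 2, 18, 2], [20] * 5)
# pts[3]["out_of_limits"] is True; the other weeks are within limits.
```

The same functions applied to a longer series would surface the runs of 7 described below, which indicate a sustained shift rather than a one-off excursion.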
This particular p-chart shows the effect of seasonal variation bringing the series out of limits in the weeks leading up to, and immediately after, the new year. After that the system reduces variation, and several runs of 7 data points are recorded towards the right of the data series, showing the system increasingly coming into control.
The chart above allows periods of instability and stability to be conveniently observed, with a period of instability in the middle of the data set contrasting markedly with improving stability afterwards.
P-charts provide invaluable insight into the stability of a system, such as a department's performance over time, when viewed through the lens of factors such as extra sessions, solo sessions or cancellations.
Templated vs Achieved
Templated vs achieved (that is, planned against actual) activity is described in two ways in the Service Reports. The upper graph shows absolute values while the lower graph shows achieved activity as a percentage of planned. The achieved work is then divided into its component parts.
Figure 6. Department templated vs achieved
The upper graph shows that the department above planned to deliver a very consistent number of sessions per week (aside from some dips over bank holidays and half-terms). We can also see that the department actually delivered considerably more sessions than that.
The lower graph shows how the department achieved the work described above. The blue horizontal line represents 100% of planned activity. The red line shows that normal consultant-led sessions account for most, although not all, of the planned activity. A small number of solo sessions, shown in orange, make a contribution. However, a large number of extra sessions, shown in green, account for considerable additional activity, resulting in the department delivering many more sessions than planned.
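The decomposition in the lower graph amounts to expressing each component of achieved activity as a percentage of the templated count. A minimal sketch with hypothetical weekly figures:

```python
def achieved_breakdown(planned, normal, solo, extra):
    """Express each component of achieved activity as a percentage
    of the templated (planned) session count for a week."""
    return {
        "normal_pct": 100 * normal / planned,    # red line
        "solo_pct": 100 * solo / planned,        # orange
        "extra_pct": 100 * extra / planned,      # green
        "achieved_pct": 100 * (normal + solo + extra) / planned,
    }

# A hypothetical week: 80 planned sessions, delivered as
# 70 normal, 4 solo and 16 extra sessions.
week = achieved_breakdown(planned=80, normal=70, solo=4, extra=16)
# achieved_pct is 112.5: the department delivered 12.5% above plan.
```

Weeks where achieved_pct sits persistently above 100% while extra_pct is large are exactly the pattern the following paragraph discusses.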
As the majority of the extra sessions deliver activity that was not planned for, this is a strong indicator that the department should consider revisiting its theatre plans to achieve a closer match between planned and achieved activity. Additionally, assuming good team productivity rates, techniques might be considered for converting some of the extra sessions into substantive sessions, whether by increasing contracted PAs, increasing flexible working sessions or taking on more staff.
Rotamap's Service Reports can provide useful insights into various aspects of department performance: the evenness of activity across the week (heatmaps), the variation of a factor over time (p-charts), and how closely a department achieves its plan and how that achievement is composed (templated vs achieved).
Use of the Service Reports can be an invaluable aid to departments seeking to improve the stability of their service. Better performance, less stress and a better work-life balance are likely to ensue.
The benchmarking data for this season are below.
If you have any questions about Rotamap's Service Reports or would like to know more about reports for your system please contact the Rotamap support team at firstname.lastname@example.org or 020 7631 1555.