Are we doing what we're supposed to be doing?

Introduction

Rotamap provides twice-yearly benchmarks for our client hospital organisations. We review these at our autumn and spring events where we also discuss interesting questions and themes that have emerged from operational data.

In this article:

Are we doing what we're supposed to be doing?

In the autumn of 2016 we held our second series of regional events for CLWRota and Medirota users.

The data presentation was given by Jacob Kirk in Birmingham and Lee Stennett in Edinburgh, and focussed on how the variety of data available from CLWRota and Medirota, including in-system reports, Service Reports, automated reports and the benchmarks, can be used to discern trends and identify areas requiring improvement.

As a starting point for using the data we asked a simple question: "Are we achieving what we expect?" This is best examined by comparing templated activity against delivered activity.

Fig 1. Demand vs Actual benchmark.

The above benchmarks show the weekly values for templated activity against delivered activity, expressed as a boxplot for each department. They show that anaesthetics departments using our CLWRota software achieve, on average, 97.2% of their templated activity. For surgical departments using our Medirota software the average is 85.4%. This lower figure is expected, as surgical departments often reduce their activity by cancelling clinics in preference to theatre lists, whereas it is rare for an anaesthetics department not to cover a list.

Whilst it is important to understand the percentage of activity moving from templated to actual each week, the most valuable piece of information is the variance. Departments with a large variance, expressed as a taller boxplot, are experiencing greater turbulence in the number of sessions delivered each week. A tighter variance, resulting from a more regular number of delivered lists, should point to a more tightly run department experiencing less friction.
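As a rough illustration of how a department could reproduce this measure from its own records, the sketch below computes a weekly delivery percentage and summarises its spread in Python. The file and column names (weekly_sessions.csv, templated_sessions, delivered_sessions) are hypothetical and do not correspond to an actual CLWRota or Medirota export.

    # Sketch: weekly delivery percentage per department and its spread.
    # Assumes a hypothetical CSV with columns:
    #   department, week, templated_sessions, delivered_sessions
    import pandas as pd

    sessions = pd.read_csv("weekly_sessions.csv")

    # Percentage of templated activity actually delivered, week by week.
    sessions["delivered_pct"] = (
        100 * sessions["delivered_sessions"] / sessions["templated_sessions"]
    )

    # Per-department summary: the median mirrors the centre line of the boxplot,
    # while the interquartile range (IQR) reflects the week-to-week variance.
    summary = sessions.groupby("department")["delivered_pct"].agg(
        median="median",
        q1=lambda s: s.quantile(0.25),
        q3=lambda s: s.quantile(0.75),
    )
    summary["iqr"] = summary["q3"] - summary["q1"]

    # A taller box (larger IQR) indicates more turbulence in weekly delivery.
    print(summary.sort_values("iqr", ascending=False))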

For a closer look at delivery against expectation within your own department, the Templated vs Achieved line graph at the bottom of the Service Report shows 12 months of data on planned (blue line) and achieved (purple line) sessions, as well as how the achieved activity is delivered: consultant/career grade led (red line), trainee led (yellow line) or extra paid-for (green line) sessions.

Fig 2. Templated vs Achieved line graph.

The above example Templated vs Achieved line graph illustrates that this department is consistently delivering close to its planned activity. However, it is using a large number of extra sessions to fulfil the plan, with little use of trainee led activity.
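For departments that want to examine the same breakdown outside the Service Report, a minimal sketch along these lines could rebuild the weekly series from session-level data. It assumes a hypothetical export with one row per templated session and columns week, delivered and cover; these names are illustrative only.

    # Sketch: rebuild the Templated vs Achieved series from session-level data.
    # Assumes a hypothetical export with one row per templated session and columns:
    #   week, delivered (True/False), cover ("consultant", "trainee" or "extra")
    import pandas as pd

    sessions = pd.read_csv("department_sessions.csv")

    planned = sessions.groupby("week").size().rename("planned")
    delivered = sessions[sessions["delivered"]]
    achieved = delivered.groupby("week").size().rename("achieved")

    # How the achieved activity was delivered each week.
    breakdown = pd.crosstab(delivered["week"], delivered["cover"])

    weekly = pd.concat([planned, achieved, breakdown], axis=1).fillna(0).astype(int)

    # A department delivering close to plan but leaning heavily on the "extra"
    # column is meeting its target at additional cost.
    print(weekly.tail(12))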

To better understand the use of extra paid-for activity or trainee led activity, the solo and extra boxplots show how your department compares to others. Again, close attention should be paid to the variance of use.

Fig 3. Extra/locum boxplot.

The highlighted department in the above boxplot of extra session usage has higher than average extra usage, but its variation is much lower than that of the surrounding departments. The lower variance represents a consistent week-to-week reliance on extra sessions, which could be a result of being under capacity.

Fig 4. Solo boxplot.

The same department is highlighted again on the solo boxplot. It has the lowest solo usage, with very little variation. Only in a few weeks, shown as the outlying circles, has solo usage approached the median.
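One way to approximate this comparison yourself is to compute each department's weekly extra and solo percentages and look at the centre and spread of each distribution. The sketch below assumes a hypothetical weekly export (weekly_usage.csv) with delivered, extra and solo session counts per department; the file, column and department names are illustrative.

    # Sketch: place one department's extra and solo usage within the peer group.
    # Assumes a hypothetical weekly export with columns:
    #   department, week, delivered_sessions, extra_sessions, solo_sessions
    import pandas as pd

    weekly = pd.read_csv("weekly_usage.csv")

    for kind in ("extra", "solo"):
        weekly[f"{kind}_pct"] = (
            100 * weekly[f"{kind}_sessions"] / weekly["delivered_sessions"]
        )

    def spread(s):
        # Median and IQR: the centre and height of a department's box.
        return pd.Series({"median": s.median(),
                          "iqr": s.quantile(0.75) - s.quantile(0.25)})

    extra_stats = weekly.groupby("department")["extra_pct"].apply(spread).unstack()
    solo_stats = weekly.groupby("department")["solo_pct"].apply(spread).unstack()

    ours = "Department A"  # hypothetical department name
    print("Extra:", extra_stats.loc[ours].round(1).to_dict(),
          "| peer median of medians:", round(extra_stats["median"].median(), 1))
    print("Solo: ", solo_stats.loc[ours].round(1).to_dict(),
          "| peer median of medians:", round(solo_stats["median"].median(), 1))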

Once a department has a better understanding of whether it is achieving what is expected, it is important to look at how this is being delivered. There are several areas that can affect whether a department meets its expected delivery: job planning, leave taking, staffing levels and staffing productivity.

Fig 5. Areas of issue a department may face, and the data sources to analyse them.

By using data available from the system, departments can investigate each of these areas to see what changes could be made to improve their productivity.

Heatmaps, which can be found on the department Service Reports, give a clear picture of the stress points throughout the week where job planning could be smoothed out. The example heatmap below shows the usage of extra sessions through the week.

Fig 6. Extra/locum heatmap.

Extra session usage is highest later in the week, specifically on Friday afternoons. This can then be compared with a heatmap for planned leave taken, showing missed sessions due to a person taking leave against templated clinical activity.

Fig 7. Planned leave heatmap.

The planned leave heatmap shows that fewer sessions are missed towards the end of the week. Where extra sessions are still required in slots that are not losing many sessions to leave, it suggests that there are not enough people job planned to work in those sessions. This can be verified by looking at the capacity report, which can be generated from RotaMC in CLWRota.

Fig 8. Capacity report.

This capacity report shows that the department is under capacity at later points in the week, with only 91% capacity on a Friday afternoon. These three corroborating data sources provide strong evidence for moving job plans to cover the end of the week, or for gaining additional variable resources that can be used to cover that activity.
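The cross-check described above (extra usage, leave-related misses and capacity, slot by slot) can be sketched from exported data along the following lines. The file names, column names and the assumption of simple AM/PM sessions are all hypothetical, and capacity is taken here as job-planned availability as a percentage of templated activity, which is an illustrative simplification.

    # Sketch: line up the three data sources by weekday and AM/PM session.
    # All file and column names below are hypothetical, for illustration only.
    import pandas as pd

    DAYS = ["Mon", "Tue", "Wed", "Thu", "Fri"]

    extras = pd.read_csv("extra_sessions.csv")    # one row per extra session worked
    missed = pd.read_csv("missed_leave.csv")      # one row per session lost to leave
    capacity = pd.read_csv("capacity.csv")        # job_planned vs templated per slot

    def heatmap(df):
        # Counts by weekday (rows) and AM/PM session (columns).
        return pd.crosstab(df["weekday"], df["session"]).reindex(index=DAYS, fill_value=0)

    extra_heat = heatmap(extras)
    missed_heat = heatmap(missed)

    # Capacity: job-planned availability as a percentage of templated activity.
    capacity["capacity_pct"] = (
        100 * capacity["job_planned_sessions"] / capacity["templated_sessions"]
    )
    capacity_grid = capacity.pivot(index="weekday", columns="session",
                                   values="capacity_pct").reindex(DAYS)

    # Slots with high extra usage, low leave-related misses and capacity below
    # 100% point to a job-planning gap rather than a leave problem.
    print(extra_heat, missed_heat, capacity_grid.round(0), sep="\n\n")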

As well as looking at heatmaps and capacity to identify issues with job planning, leave and staffing levels, it is also important to look at the productivity of staff within the department. Using RotaMC in CLWRota and the Productivity report in Medirota, we can determine to what extent a department is delivering its obligated work.

In the department we have been looking at, around 13% of all work is being provided at extra/locum cost. It is therefore essential to work out whether a portion of that 13% reflects work that is missing from contractual delivery.

Fig 9. Demonstrative RotaMC staffing report.

Using the RotaMC staffing report and averaging each staff member's percentage achieved, we find that the department's productivity is 86%. On the face of it this suggests that the department may be adequately resourced to cover the expected target, but is having difficulty hitting that target without using extra sessions. The department may wish to investigate productivity in more detail and consider reviewing job plans to better cover the service.
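As a worked illustration of that last step, the sketch below averages a percentage-achieved figure across staff and sets it against the share of work bought in as extra sessions. The staffing_report.csv file and its columns are hypothetical stand-ins for the RotaMC staffing report, and the 13% extra share is simply carried over from the example above.

    # Sketch: productivity as the average of each person's percentage achieved,
    # set against the share of delivered work bought in as extra sessions.
    # File and column names are hypothetical, not an actual RotaMC format.
    import pandas as pd

    staffing = pd.read_csv("staffing_report.csv")  # staff, expected_sessions, achieved_sessions

    staffing["pct_achieved"] = (
        100 * staffing["achieved_sessions"] / staffing["expected_sessions"]
    )
    productivity = staffing["pct_achieved"].mean()  # ~86% in the example above

    extra_share = 13.0  # % of delivered work provided as extra/locum (example figure)

    # If the undelivered share of contracted work is of the same order as the
    # extra/locum share, reviewing job plans is the natural next step.
    print(f"Average percentage achieved: {productivity:.0f}%")
    print(f"Undelivered contracted work: {100 - productivity:.0f}% "
          f"vs extra/locum share: {extra_share:.0f}%")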

Benchmarks

The benchmarking data for this season are below.

Questions

If you have any questions about the above ideas or would like to know more about how to get reports from your system please contact the Rotamap support team at support@rotamap.net or 020 7631 1555.
