Cross-department Benchmarking Data

In autumn 2015 we held a series of regional events across the UK for CLWRota and Medirota. The first event was held in Bristol on 10 September 2015. At these events we presented our latest set of departmental data and benchmarks.

We use the Rota Management Console (RotaMC) to generate the data for our benchmarks. RotaMC is designed to generate reports and statistics for each department and allows fine-grained control over which data is included or excluded in performance reports.

At every benchmark presentation, each department's name is replaced with a random two-character code to maintain the anonymity of the department.

Two-department comparison

Our September 2015 data analysis took the form of a comparison between two similarly sized departments: department A with 95 anaesthetists (46 consultants, 15 middle grades, 34 trainees) and department B with 103 anaesthetists (59 consultants, 15 middle grades, 29 trainees). However, department A was a split-site DGH and department B a single-site teaching hospital.

Using the good department checklist we came up with at the March 2015 event, we examined the two departments' performance on a variety of measures.

The CLWRota good department checklist

Planning the rota:

  • Job plans accurately reflect demand
  • Rota is planned a consistent time ahead

Usage of staff:

  • A consistent number of trainee-led and extra sessions
  • Consultants meeting their expected sessions target
  • Trainees receive a suitable amount of training time

Setup and publish times

We believe that the most successfully run departments are able to set up and publish their rotas a consistent time ahead of the week. The graph below compares the number of days ahead that the rota is set up and published at departments A and B. Both departments display good consistency in their setup and publish times, with one noticeable exception in July 2015 for department A. Department B, the teaching trust, is slightly more consistent but publishes a few days later than department A.

Comparison of setup and publish times
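
As a rough illustration of how a days-ahead figure like this can be derived, the sketch below (in Python, with invented dates; RotaMC produces these figures for you) measures the gap between a week's publish date and the Monday that week begins:

    from datetime import date

    def days_ahead(week_start, published_on):
        """Days between the publish date and the Monday the rota week begins.

        Positive values mean the rota was published ahead of the week.
        """
        return (week_start - published_on).days

    # Hypothetical example: a rota week starting Monday 6 July 2015
    # that was published on 19 June 2015 was published 17 days ahead.
    print(days_ahead(date(2015, 7, 6), date(2015, 6, 19)))  # -> 17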

Capacity

A comparison of the capacity information from departments A and B, which sets the number of staff who could work on a session against the number of lists needing cover, revealed that on average department A could cover 96% of its activity with its available staff, but department B could only cover 80%.

Comparison of capacity information
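
In outline, the capacity figure is the proportion of lists needing cover that the available staff could take. A minimal sketch, assuming the session data reduces to (staff available, lists needed) pairs:

    def capacity_percentage(sessions):
        """Percentage of lists needing cover that available staff could cover.

        `sessions` is an iterable of (staff_available, lists_needed) pairs,
        one per session; cover in any session is capped at the number of
        lists actually needing cover.
        """
        covered = sum(min(staff, lists) for staff, lists in sessions)
        needed = sum(lists for _, lists in sessions)
        return 100 * covered / needed

    # Hypothetical week of four sessions.
    week = [(12, 10), (8, 10), (9, 9), (7, 10)]
    print(f"{capacity_percentage(week):.0f}%")  # -> 87%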

The significantly lower capacity of department B means that it must make up the lack of cover either through trainee-led lists or by using extra paid-for sessions. Examination of the departments' line graphs clearly indicates that a large proportion of the cover at department B is being provided as paid-for extra sessions.

Comparison of department line charts

However, the line graphs also reveal a startling difference in the number of activities being covered per week by the two departments. Department B is covering an average of 363 activities per week, almost 100 more than department A, which averages 264, with the vast majority of those extra activities being delivered as extra sessions. This is despite the fact that the two Trusts have almost exactly the same number of beds. This can be seen more clearly in the table below:

Comparison of average activities
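
For illustration, a weekly average like this can be derived by grouping dated activity records by ISO week; the record format here is our own assumption:

    from collections import Counter
    from datetime import date

    def average_weekly_activities(activity_dates):
        """Mean number of activities per ISO week, one date per activity."""
        per_week = Counter(d.isocalendar()[:2] for d in activity_dates)
        return sum(per_week.values()) / len(per_week)

    # Hypothetical sample: three activities across two weeks -> 1.5 per week.
    sample = [date(2015, 9, 7), date(2015, 9, 8), date(2015, 9, 14)]
    print(average_weekly_activities(sample))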

We can also see this difference if we compare the two departments' template vs. achieved graphs, which would also show if there is a problem in the planning of department B. The template vs. achieved charts show whether a department is delivering its planned number of sessions, based on its templates, and also how those sessions are being delivered.

Comparison of template vs. achieved
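
In outline, such a comparison reduces to counting the sessions on the template against the sessions actually delivered, broken down by how each was covered. A sketch under assumed labels:

    from collections import Counter

    def template_vs_achieved(planned, delivered):
        """Compare the templated session count against delivered sessions.

        `planned` is the number of sessions on the template; `delivered` is
        a list of labels describing how each session was actually covered.
        """
        return {
            "planned": planned,
            "achieved": len(delivered),
            "breakdown": dict(Counter(delivered)),
        }

    # Hypothetical week: 20 templated sessions, 21 delivered.
    week = ["consultant"] * 16 + ["solo"] * 2 + ["extra"] * 3
    print(template_vs_achieved(20, week))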

The graphs above illustrate that both departments are good at delivering close to their plan. However, department B has to meet its plan through a large number of extra sessions, whereas department A meets its plan with consultant activity and only uses solos or extras to run sessions above the plan.

Our benchmarks provide a comparison of how departments using CLWRota across the UK are performing. The boxplot below shows how demand, indicated by templates, compares to achieved, the total sessions covered, across a range of trusts.

Demand vs. Achieved benchmark
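
A benchmark chart of this kind can be sketched with matplotlib, plotting each trust's weekly achieved-to-template ratios as a box; the trusts and ratios below are invented for illustration:

    import matplotlib.pyplot as plt

    # Invented weekly achieved/template ratios for three anonymised trusts.
    trusts = {
        "AA": [1.02, 1.05, 0.99, 1.03],
        "BB": [1.08, 1.10, 1.07, 1.09],
        "CC": [0.92, 1.01, 0.88, 1.11],
    }

    fig, ax = plt.subplots()
    ax.boxplot(list(trusts.values()))
    ax.set_xticks(range(1, len(trusts) + 1), list(trusts))
    ax.axhline(1.0, linestyle="--")  # achieved exactly matching the template
    ax.set_ylabel("Achieved / template sessions")
    ax.set_title("Demand vs. achieved by trust")
    plt.show()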

Both departments A and B deliver slightly more sessions than are currently planned, as we saw in the template vs. achieved charts, and both sit slightly above the average. Both trusts also show a tight variance, which is a good indication that they are able to consistently meet the sessions they are planning.

Delivery

Our heatmaps can show how delivery is spread through the week. For example, the heatmaps below show the usage of solo sessions and extra sessions through the week at departments A and B.

Comparison of solo session heat map

Comparison of extra session heat map
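
A heatmap of this kind is essentially a tally of sessions by weekday and session type (the real charts also split mornings from afternoons). A minimal sketch, assuming each session record carries a date and a type label:

    from collections import Counter
    from datetime import date

    DAYS = ["Mon", "Tue", "Wed", "Thu", "Fri"]

    def weekday_counts(sessions, session_type):
        """Count sessions of one type per weekday, Monday to Friday."""
        counts = Counter(
            day.weekday() for day, kind in sessions if kind == session_type
        )
        return {name: counts.get(i, 0) for i, name in enumerate(DAYS)}

    # Hypothetical (date, type) records.
    records = [
        (date(2015, 9, 7), "solo"),    # a Monday
        (date(2015, 9, 10), "extra"),  # a Thursday
        (date(2015, 9, 10), "extra"),
    ]
    print(weekday_counts(records, "extra"))  # Thu -> 2, all other days 0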

Both departments A and B use more solo sessions in the afternoons than in the mornings. Department B has its highest number of extra sessions falling on Thursdays, so there may be a case for more people to be job planned to work Thursdays.

The boxplot below shows the usage of extra sessions at departments as a percentage of their total activity.

Extra sessions benchmark

Departments A and B are highlighted, and it is clear that department B is using a significantly higher number of extra sessions than A, despite the fact that the hospitals are very similar in size. As we saw earlier in the line chart comparison, department B is averaging around 100 more sessions a week, and the bulk of this extra activity is being worked as extra sessions.

Looking at the benchmark for solo lists also shows a difference between A and B. Department A sits at about the average of 4.1% of lists being covered as trainee-led lists, whereas B is significantly below the average.

Solo sessions benchmark
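
Both of these benchmarks reduce to the share of total activity delivered as a given session type. For example, under the same assumed record format as the earlier sketches:

    def session_share(session_types, session_type):
        """Percentage of all delivered sessions that are of `session_type`."""
        matching = sum(1 for kind in session_types if kind == session_type)
        return 100 * matching / len(session_types)

    # Hypothetical department delivering 200 sessions in total.
    sessions = ["solo"] * 9 + ["extra"] * 25 + ["consultant"] * 166
    print(f"solo:  {session_share(sessions, 'solo'):.1f}%")   # -> 4.5%
    print(f"extra: {session_share(sessions, 'extra'):.1f}%")  # -> 12.5%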

Could B perhaps do more solo lists to cover some of the sessions it is currently using as extras? To understand this further it is worth looking at the makeup of the trainees. The table below shows how many trainees each department has at grades CT1-2, ST3-4 and ST5+. As can be seen, the number of senior trainees is very similar.

Trainee numbers at departments A and B

Deciding on how many solos a department should be doing is a nuanced issue, but hopefully comparing data with other trusts will help departments find the right number for them.

Notes

For the purposes of these graphs, 'normal activity' is defined as sessions provided by a consultant or a middle grade working at normal rate. 'All activity' is defined as normal activity, plus trainees working on solo sessions, plus extra activity, plus locum activity.
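
Expressed as code, these definitions might look like the sketch below; the grade, rate and session-type labels are our own assumptions for illustration:

    NORMAL_GRADES = {"consultant", "middle grade"}

    def is_normal_activity(grade, rate):
        """'Normal activity': a consultant or middle grade at normal rate."""
        return grade in NORMAL_GRADES and rate == "normal"

    def counts_in_all_activity(grade, rate, session_type):
        """'All activity': normal activity plus solo, extra and locum sessions."""
        return (
            is_normal_activity(grade, rate)
            or session_type in {"solo", "extra", "locum"}
        )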

If you would like any further information on how this data was compiled please contact the CLWRota team via support@rotamap.net or call the office on 020 7631 1555.

Please note that, as with all information on this site, all images and information on this site are copyright © Rotamap Ltd.
