Calgary: Alberta Health Services (AHS), which released its newly redesigned performance reporting system on Wednesday, says the new system “follows several months of planning and consultation with stakeholders, including clinicians, which led to a decision to streamline previously reported performance measures so they are easier to understand and interpret.”
Easier to understand and interpret? The old system had little coloured symbols for every performance measure; red for bad, green for good, and yellow for can’t tell. How much easier does it get? The problem with the old system wasn’t that it was difficult. It was that it was childish nonsense, a collection of chart junk and junk science violating basic principles of data analytics and reporting.
So what did we get for the months of planning and consultation? Well, the red, green and yellow symbols have been replaced by a coloured line. Not just any coloured line but a very cool-looking line with a nicely gradated blue-green colour scheme. Along this line, a black dot represents the performance target for the current year, a vertical red bar represents the target for next year, a yellow diamond represents the national average, and a blue triangle represents the actual level of performance. So, after months of planning, same junk in a different package.
Well, not exactly the same. There is much less of it. The old performance reporting system had over 50 performance measures. The new streamlined system has 16. It's not clear how this provides more information to Albertans. Isn't 16 smaller than 50? It must be the new math. Or maybe it's just cherry-picking statistics that make AHS look good.
Give AHS credit though. The details section behind each performance metric has improved. These back pages actually use the right graphical tools to analyze and present the data. Run charts, basically line graphs, properly present the data in context and in original time order.
But the data presented on these charts are yearly averages. Averages destroy the informational content in performance measurement data. For example, it's all very interesting that the average length of stay in emergency for discharged patients is 3.1 hours. It's also useless. A valid and honest presentation would include the data around that average, especially the maximum wait times people can expect on any given day. The details pages use the right data presentation method, but mangle and corrupt the data before presenting them, rendering the result both meaningless and misleading.
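The point is easy to demonstrate with a few made-up numbers. The week of daily waits below is purely illustrative, not AHS data, but it shows how a tidy 3.1-hour average can sit on top of days that are twice as bad:

```python
import statistics

# Hypothetical daily average emergency waits (hours) for one week.
# Illustrative numbers only, not AHS data.
waits = [1.5, 2.0, 2.2, 2.5, 3.0, 4.5, 6.0]

avg = statistics.mean(waits)   # the single number the report shows
worst = max(waits)             # the number a patient actually cares about

print(f"average wait: {avg:.1f} h, worst day: {worst:.1f} h")
# The average is 3.1 hours, yet the worst day is nearly double that.
```

Report only the average and the 6-hour day simply disappears; report the data around the average and the reader can see what a bad day looks like.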
Averages are even more deceptive when combined with performance targets. For example, the AHS emergency wait time target is three hours. Numbers like this tend to stick in our minds. We start to think that our visit to emergency won't take longer than three hours. (This is called the anchoring heuristic, and we all do it.)
In fact, if three hours is the typical wait, it means roughly one of every two people is waiting longer than the three-hour target. It could be one hour longer or 12 hours longer. Nobody knows. The devil is in the details and the details are buried behind the average.
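A small simulation makes the same point. Assume, purely for illustration, that wait times are right-skewed (most visits are shortish, a few drag on) with a typical wait of about three hours; the distribution and its parameters below are hypothetical, not AHS data:

```python
import random

random.seed(1)

# Hypothetical right-skewed wait times (hours) with a median near
# three hours. Lognormal parameters are illustrative, not AHS data.
mu, sigma = 1.0986, 0.6  # ln(3) ~= 1.0986, so the median is ~3 hours
waits = [random.lognormvariate(mu, sigma) for _ in range(10_000)]

over = sum(w > 3 for w in waits) / len(waits)  # share waiting past 3 h
longest = max(waits)                           # worst single wait seen

print(f"{over:.0%} wait longer than 3 hours; "
      f"longest wait: {longest:.1f} h")
```

Under this assumption, about half the simulated patients wait longer than three hours, and the longest waits run many times that. The single three-hour figure says nothing about either fact.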
That may be why the new performance reporting system at AHS will only be providing semi-annual averages from here on in. AHS's "mission is to provide a patient-focused, quality health system that is accessible and sustainable for all Albertans." How does misleading Albertans on healthcare system performance help with that?
Robert Gerst is a Partner in Charge of Operational Excellence and Performance Analytics at Converge Consulting Group Inc. He is author of The Performance Improvement Toolkit and numerous peer-reviewed articles.