Cardiac care at Queen Elizabeth Hospital Birmingham: have we learned since Bristol?
Published: 09 Mar 2016
Yesterday (8 March 2016) the CQC published its report on the heart surgery unit at the Queen Elizabeth Hospital Birmingham. It makes for grim reading. I believe it is appropriate to reflect on this matter and on what all of us in healthcare might do to prevent it happening again. We are all part of the solution.
At HQIP we commission the national clinical audit programme, covering many areas of clinical care. In the case of adult cardiac surgery, the organisation commissioned to run the audit is the National Institute for Cardiovascular Outcomes Research (NICOR), with the full cooperation of the professional body, the Society for Cardiothoracic Surgery (SCTS).
The role of organisations such as NICOR is to gather data and work with hospitals and clinicians to verify it, allowing them to measure outcomes and improve the care they offer. Such measures are very useful for clinicians and hospitals, which can compare the results of their services, and they can be used to assure patients and commissioners that all is well. It is therefore no surprise that the majority of hospitals and clinicians are passionate about their results, which makes our job easy: we work with a very energetic and empowered group of clinicians.
In many respects this is a result of the long journey we have been on since Bristol, at the end of which we can now say that 'clinicians are open about clinical performance, allowing patients to access information about the relative performance of hospitals, services and consultants'. So the answer to the question of whether all of us concerned with better healthcare have learned from Bristol would likely be 'yes, but must try harder'. Further, over the many years the adult cardiac surgery audit has been running, it has shown improving results in an environment where patients are of increasing complexity. While this improvement is stimulated and monitored by the audit, the focus of consultants on outcomes has been an important part of it.
It has come as a great shock to those of us in the clinical audit community to receive such a kickback from Birmingham. Two adult cardiac surgery hospitals came in as three-standard-deviation ('red') outliers. The other one accepted that there was a problem and worked with NICOR and SCTS on a substantial action plan, so that current preliminary results show great improvement. That is what we would expect: these audits exist to allow clinicians to look at their results and improve them when they show a problem, or where there might be one. This is national audit working at its best.
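For readers unfamiliar with outlier reporting, the idea behind a three-standard-deviation 'red' flag can be sketched in a few lines. This is a minimal illustration under a simple binomial assumption with invented figures, not NICOR's actual risk-adjusted methodology, which adjusts the expected rate for each patient's individual risk.

```python
import math

def outlier_status(deaths, cases, national_rate):
    """Flag a unit whose observed mortality exceeds the expected count
    by more than two or three binomial standard deviations.
    Illustrative only: real audits risk-adjust case by case."""
    expected = cases * national_rate
    sd = math.sqrt(cases * national_rate * (1 - national_rate))
    z = (deaths - expected) / sd
    if z > 3:
        return "red (3 SD outlier)"
    if z > 2:
        return "amber (2 SD alert)"
    return "within control limits"

# Hypothetical figures: 40 deaths in 1,000 operations against a 2.5% national rate
print(outlier_status(40, 1000, 0.025))  # -> red (3 SD outlier)
print(outlier_status(30, 1000, 0.025))  # -> within control limits
```

The point of the wide three-standard-deviation limit is that a unit is only flagged 'red' when the gap between observed and expected deaths is very unlikely to be chance variation.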
The alternative, and more dangerous, approach is what we have seen in Birmingham. It is dangerous because if the data is right but denied, then things are not rectified and patients might die. Whenever the data suggests there may be a problem, the safest way to manage the situation is to assume the data is correct and, while reviewing it to see whether it is correct or not, to set in train an improvement programme to rectify the situation. If the data subsequently turns out to be incorrect, nothing has been lost.
When hospitals deny (and continue to deny) that there is a problem, challenging the data, the cases included or the analytical methodology despite multiple rounds of data validation and re-analysis, the reality is that improvement planning is delayed, with consequences for the safety of patient care. Patients treated in private hospitals come under different governance frameworks, which will affect their care, so they should not be considered the same.
One other failure in the management of the data is that it clearly illustrates that the information flows were not real-time ones. While clinical audits are fantastically powerful in providing detailed local comparative results against a national picture, hospitals submit data to NICOR in different ways and on different timescales. One expects modern hospitals to monitor and react to their own data on an ongoing basis, so that clinicians and managers can see how patient outcomes are progressing in real time. The advantage is that deviation from the norm can be picked up early and rectified before harm is done. If one waits for the annual submission to work its way through, it can take 18 months to pick up a problem, which, one would argue, is too late. It should be noted, however, that many national clinical audits have already moved to real-time reporting, and HQIP is supporting moves to broaden that.
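One widely used form of real-time monitoring in cardiac surgery is the variable life-adjusted display (VLAD), which plots the running total of expected minus observed deaths after each operation. The sketch below uses invented figures and is only a simplified illustration of the idea, not any particular audit's implementation.

```python
def vlad(series):
    """Cumulative expected-minus-observed deaths, VLAD-style.
    series: (predicted_risk, died) per operation, in date order.
    A curve that drifts steadily downwards suggests outcomes are
    worse than the case mix predicts and warrants early review."""
    total, curve = 0.0, []
    for risk, died in series:
        total += risk - (1 if died else 0)
        curve.append(round(total, 2))
    return curve

# Invented example: three low-risk operations, the third a death
ops = [(0.02, False), (0.03, False), (0.02, True)]
print(vlad(ops))  # -> [0.02, 0.05, -0.93]
```

Because the curve is updated after every case rather than once a year, a run of unexpected deaths shows up within weeks, which is exactly the early warning the annual submission cycle cannot provide.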
In relation to learning, there are areas that we must, with others, work on. We are bringing together a methodology and statistics group to advise the national audits so that we can be more responsive to challenge in the future. We shall also review current methodologies so that the audits are placed in a position where challenge is unlikely. And we will disseminate best practice so that all audits for which it is suitable report in real time within their units.
We will discuss with NHS England and the CQC whether the process for escalating the outcomes was responsive enough, and whether these negative results should have been publicised more widely than was the case: they were openly displayed on the SCTS and MyNHS websites.
Most of all, all of us concerned with getting better care for patients must reflect seriously on the challenge from the Guardian (4 March 2016) that 'the health service continues to harbour some dangerously defensive instincts'. This will require cultural change, as culture did play a role in this affair. The vast majority of hospitals and clinicians are not in this category, and all of our national audits serve to prove it, but still we must all work harder to move that from the vast majority to all.
Professor Danny Keenan, medical director, HQIP
Find me on Twitter: @DannyHQIP