Thursday 7 October 2010

Sometimes we get it right – but we don’t make it easy

At the risk of sounding like a health information geek, I have to say it’s been fascinating to get to know the Sixth National Adult Cardiac Surgical Database Report 2008.
What makes it so interesting is that it’s clearly designed to deliver to clinicians exactly the information they need to compare their own performance in cardiac surgery against national benchmarks. The report, one of several produced by e-Dendrite Clinical Systems Ltd on behalf of different clinical associations and based on one of the many data registries they hold, shows indicators defined by clinicians and calculated to their specifications.

This is diametrically opposed to the approach adopted by the national programmes that have dominated health informatics in England over the last ten years, apparently about to vanish without trace and without mourners. They based themselves on datasets that were at one time called ‘minimum’. The word has been dropped but it still applies: these datasets represent the least amount of data that a hospital can sensibly be expected to collect without putting itself to any particular trouble. Essentially, this means an extract from a hospital Patient Administration System (PAS) alone, with none of the work on linkage between different data sources that I’ve discussed before.

The indicators that can be produced from such minimal information are necessarily limited. Usually we can get little more than length of stay, readmissions and in-hospital mortality. We’ve already seen how misleading the last of these can be when I talked about the lurid headlines generated over Mid Staffordshire Trust.
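To see how little that buys you, here’s a minimal sketch of the sort of indicators a bare PAS extract supports. The column names are my own inventions, and the coding subtleties are ignored:

    import pandas as pd

    # A bare PAS-style extract: little more than admission and discharge
    # dates and a death-in-hospital flag (all column names hypothetical).
    pas = pd.DataFrame({
        "nhs_number":   ["111", "111", "222"],
        "admission":    pd.to_datetime(["2008-01-03", "2008-01-20", "2008-02-01"]),
        "discharge":    pd.to_datetime(["2008-01-10", "2008-01-25", "2008-02-05"]),
        "died_in_hosp": [False, False, True],
    })

    # Length of stay falls straight out of the two dates.
    pas["los_days"] = (pas["discharge"] - pas["admission"]).dt.days

    # Readmission within 28 days: compare each admission with the same
    # patient's previous discharge.
    pas = pas.sort_values(["nhs_number", "admission"])
    prev_discharge = pas.groupby("nhs_number")["discharge"].shift()
    pas["readmitted_28d"] = (pas["admission"] - prev_discharge).dt.days.le(28)

    # In-hospital mortality is just the flag; a death a week after
    # discharge is completely invisible to this extract.
    print(pas[["nhs_number", "los_days", "readmitted_28d", "died_in_hosp"]])

That last point is the whole problem: nothing in the extract can see past the hospital door.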

The contrast with the Cardiac Surgery Database could hardly be more striking.

Clinicians have defined what data they need, and have made sure that they see just that. If it’s not contained in an existing hospital system, they collect it specifically. The base data collection form shown in the report covers six pages. There are several other multi-page forms for specific procedures.

An automatic feed from the PAS can provide some of the patient demographic data, but apparently in most contributing hospitals the feed is minimal. All the other data has to be entered by hand. It must be massively labour-intensive, but clinicians ensure it’s carried out because they know the results are going to be useful to them.

An example of the kind of analysis they get is provided by the graph below, showing survival rates after combined aortic and mitral valve surgery, up to five years (or, more precisely, 1825 days) following the operation. What’s most striking about this indicator is that it requires just the kind of data linkage that we ought to be carrying out routinely, in this case with records from outside the hospital: patient details are linked to mortality figures from the Office for National Statistics (ONS), meaning that we’re looking at deaths long after discharge and not just the highly limited values for in-hospital deaths that were used in the press coverage about Mid Staffordshire.

[Graph: survival after combined aortic and mitral valve surgery, up to 1825 days after the operation]

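The linkage itself is less exotic than it might sound. Here’s a minimal sketch, assuming a surgical register and an ONS deaths extract that share a patient identifier; every field name here is my own invention, not the report’s schema:

    import pandas as pd

    # Hypothetical extracts: a surgical register and ONS deaths.
    operations = pd.DataFrame({
        "nhs_number": ["111", "222", "333"],
        "op_date": pd.to_datetime(["2003-05-01", "2004-02-10", "2005-07-20"]),
    })
    ons_deaths = pd.DataFrame({
        "nhs_number": ["222"],
        "date_of_death": pd.to_datetime(["2006-03-15"]),
    })

    # The linkage: join surgical records to ONS mortality on the shared
    # identifier. Unmatched patients are alive as far as the extract knows.
    linked = operations.merge(ons_deaths, on="nhs_number", how="left")

    # Days from operation to death, and status at the report's five-year
    # (1825-day) horizon.
    linked["days_to_death"] = (linked["date_of_death"] - linked["op_date"]).dt.days
    linked["dead_within_5y"] = linked["days_to_death"].le(1825)
    print(linked)

A real survival analysis would also need the date of the ONS extract, so that survivors can be censored correctly, but the join is the easy part; the hard part, as ever, is having reliable identifiers on both sides.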
Less obvious, but at least as significant, is the fact that the figures have been adjusted for risk – and not by some general rule applied to all patients, but on the basis of a series of risk factors relevant to cardiac surgery: smoking, history of diabetes, history of hypertension and history of renal disease, to mention just a few.
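For illustration only, since the report has its own clinician-specified model, risk adjustment of this kind usually means fitting the outcome against the patient-level risk factors and then comparing each unit’s observed deaths with the deaths the model expected given its case mix. A generic sketch on made-up data:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Made-up patient-level data: four binary risk factors (say smoking,
    # diabetes, hypertension, renal disease) and a five-year death outcome.
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(500, 4))
    true_logit = -2.0 + X @ np.array([0.5, 0.7, 0.4, 0.9])
    y = rng.random(500) < 1 / (1 + np.exp(-true_logit))

    # Fit the risk model on the pooled ('national') data, then score one
    # unit's patients against it.
    model = LogisticRegression().fit(X, y)
    unit = slice(0, 50)   # pretend these rows are one unit's cases
    expected = model.predict_proba(X[unit])[:, 1].sum()
    observed = y[unit].sum()

    # An observed-to-expected ratio above 1 means more deaths than the
    # unit's own case mix would predict.
    print(f"O/E ratio: {observed / expected:.2f}")

The point is that the expected count reflects each unit’s own case mix, so a surgeon who takes on sicker patients isn’t penalised for doing so.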

Looking at the list of data collected, it’s clear that more automatic support could be provided. For instance, it should be possible to provide information about previous cardiac interventions or investigations, at least for work carried out in the hospital. Obviously, this would depend on the hospital collecting the data correctly, but a failure in data collection is surely something to be fixed rather than an excuse for not providing the information needed.

It is unlikely that the hospital could provide the information if the intervention took place at another Trust, so cardiac surgery staff would still have to ask the question and might have to input some of the data themselves. Automatically providing whatever data is available would, however, still represent a significant saving of effort.
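Pre-populating the form from local records would mostly be a matter of a query against whatever the Trust already holds. A sketch, against an entirely hypothetical table of coded procedure episodes:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE procedure_episodes (
        nhs_number TEXT, procedure_code TEXT, procedure_date TEXT)""")
    conn.executemany(
        "INSERT INTO procedure_episodes VALUES (?, ?, ?)",
        [("111", "K40", "2001-06-01"),    # placeholder procedure codes
         ("111", "K63", "2002-01-15")],
    )

    def previous_cardiac_work(nhs_number, before_date):
        """Locally recorded procedures before the index operation."""
        return conn.execute(
            """SELECT procedure_code, procedure_date FROM procedure_episodes
               WHERE nhs_number = ? AND procedure_date < ?
               ORDER BY procedure_date""",
            (nhs_number, before_date),
        ).fetchall()

    print(previous_cardiac_work("111", "2003-05-01"))

Whatever a query like this can’t see, above all work done at another Trust, still has to be asked for by hand, exactly as above.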

The converse would also be invaluable: if cardiac surgery staff are adding further data, surely it should be uploaded into the central hospital database or data warehouse? That would make it available for other local reporting. It seems wasteful to collect the data for one purpose and not make it available for others.
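Nothing technically demanding is needed for that either. Continuing the same hypothetical schema, the write-back is a simple insert-or-update into a table the warehouse exposes to other reports:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE cardiac_registry_extra (
        nhs_number TEXT PRIMARY KEY, diabetes INTEGER, renal_disease INTEGER)""")

    def upload_registry_fields(nhs_number, diabetes, renal_disease):
        """Push hand-collected registry fields back for other local reporting."""
        conn.execute(
            # OR REPLACE: a later correction overwrites the earlier row.
            "INSERT OR REPLACE INTO cardiac_registry_extra VALUES (?, ?, ?)",
            (nhs_number, diabetes, renal_disease),
        )
        conn.commit()

    upload_registry_fields("111", 1, 0)   # first entry from the cardiac form
    upload_registry_fields("111", 1, 1)   # clinician corrects a field later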

Of course, all this depends on linkage between data records. It’s becoming a recurring refrain in these posts: linkage is something that should be a key task for all Trust information departments. What we have here is another powerful reason why it needs to be done systematically.

And while we’re thinking about data linkage, let’s keep reminding ourselves that this report uses links to ONS mortality data. Doing that for hospital records generally would provide far more useful mortality indicators. So what’s stopping us doing it?