NHS data ‘not accurate enough’ to assess doctors
- 31 October 2006
NHS hospital episode data is not accurate enough to monitor individual doctors’ performance, according to a pilot study conducted by the Royal College of Physicians’ (RCP) iLab.
The pilot allowed doctors to access the Hospital Episode Statistics (HES) in England and the Patient Episode Database for Wales (PEDW). Both databases include information such as when a patient is admitted to hospital, what medical condition they have, which consultant they are allocated to, and when they are discharged. Doctors drew on the expertise of the RCP’s iLab staff to help them interpret the data.
Inaccuracies found in the data were due to a range of factors including: activity being allocated to the wrong consultants; incorrect lengths of stay for inpatients; and failure to collect and record all the relevant data.
iLab project manager, Dr Giles Croft, told E-Health Insider that uncoded activity was a problem for participants in the pilot: “There were large numbers of procedures they knew they were doing, but had not made their way through to the database.”
The iLab researchers concluded that one of the major reasons for the inaccuracies was that doctors were not involved enough in collecting, checking and using these data – HES and PEDW data are entered by other hospital staff and coded by clinical coders.
“We strongly believe that the way forward is for clinicians to become more involved. Analyses should be provided locally,” Dr Croft told EHI. “In the next phase of our research in Wales we are providing information departments with our methodology for how we ran the data for the consultants and providing them with support. We will be providing clinicians with support materials.
“The problem is information departments having the resources and for it [the data analysis] being enough of a priority. They are so stretched at the moment.”
Encouragingly, doctors were much more likely to want to be involved in clinical data issues once they had been part of the project. The iLab team report that many wanted to improve the accuracy of data held against their name, but said they would need local help to do so.
Asked whether the HES and PEDW were valid for other purposes, Dr Croft said: “As you start looking at a higher level a lot of these problems disappear. Many are to do with the allocation of activity at an individual level.”
The issue of monitoring and reporting on doctors’ individual performances has been debated widely since the high-profile inquiry into excess deaths in babies undergoing heart surgery at the Bristol Royal Infirmary between 1988 and 1994.
The question of how to use hospital data to detect ‘outlier’ clinicians whose performance falls below acceptable levels – without wrongly pointing the finger at others – has proved difficult.
In addition, doctors now have to undergo appraisal and revalidation, which depend in part on accurate clinical data held about them by the institutions where they work.
Dr Croft said the latest report on the issue from the chief medical officer mentioned using HES data to revalidate doctors, but the RCP thought this would be unwise at the moment.
Some groups of clinicians, notably cardio-thoracic surgeons, have published performance statistics, but Dr Croft pointed out that these came from clinically designed databases in which clinicians had confidence.
This type of recording for individual specialities was, however, expensive and time consuming to set up, he said. The gold standard was a clinically designed and standardised patient record where the information ‘falls out’ as a product of routine record keeping.
The NHS Information Centre, which produces HES, welcomed the RCP’s report, but stressed that HES data were originally designed to monitor activity and health trends across the service, and to allocate resources.
An IC statement said: “For this reason they have been most effective when examined at an aggregate, national level and are used effectively by public health observatories, the Healthcare Commission and to support research and clinical audit.
“The research by the RCP has identified that there are some limitations in using HES data, if they are to be used for purposes outside their original scope, for example for the appraisal and revalidation of individual physicians, without supplementary local and supporting evidence.”
The IC welcomed the college’s recommendations on training and education of clinicians and said it remained keen to work with the RCP and other professional bodies in the development of new clinical information systems and processes to improve clinically meaningful measures of activity and response.