News Roundup [abridged versions appear in the paper journal]

NHS data are not accurate enough for monitoring doctors' performance

BMJ 2006;333:937 (Published 02 November 2006)

Susan Mayor, London

    Data routinely collected about patients attending NHS hospitals in England and Wales are not suitable for monitoring the performance of individual doctors, says a report by the Royal College of Physicians for the Department of Health in England and the Welsh Assembly Government.

    The research explored the potential use of hospital episode statistics (HES) in England and the Patient Episode Database for Wales (PEDW) to support the appraisal and revalidation of consultants.

    These databases include information such as when a patient is admitted to hospital, their medical condition, which consultant they are allocated to, and when they are discharged. In his latest report on revalidation, the chief medical officer suggested that the databases be used as part of the information to assess the performance of doctors (BMJ 2006;333:163).

    The Department of Health in England and the Welsh Assembly Government commissioned the Royal College of Physicians to set up an information laboratory to investigate the quality of the data from a clinical perspective. More than 1300 consultant doctors from a range of specialties were asked their views on HES and PEDW data. They generally lacked confidence that the data would accurately reflect their practice. Most of them did not see the locally coded results of their own activities for inpatients or day cases. Eighty per cent reported that they had little or no communication with their hospital information and coding departments.

    The information laboratory then guided 50 consultants through analyses of data held in their names in HES and PEDW, with help from a clinical research fellow and an information analyst. The consultants found discrepancies between the data and what happened in practice. Problems included activity being allocated to the wrong consultants, incorrect lengths of stay for inpatients, and failure to collect and record all the relevant data.

    In some cases, anomalies were traced back to local causes and were subsequently resolved. In other cases, it was thought that the databases—which were originally designed for largely financial and administrative purposes—could not accurately reflect clinical working practices and the complexities of multiprofessional patient care.

    Giles Croft, clinical research fellow at the Royal College of Physicians' information laboratory, said, “Participants felt HES-PEDW data at the individual consultant level to be inaccurate, incomplete, and an unsuitable measure of their clinical practice, especially when comparing it with the practice of others, in particular consultants from outside their own hospital.”

    He suggested that one of the main reasons for this was that doctors were not sufficiently involved in collecting, checking, and using these data—they are entered by other hospital staff.

    On the positive side, 75% of doctors found the data more useful than they had thought, and 94% wanted to help improve data collection after taking part in the project.

    John Williams, director of the health informatics unit at the Royal College of Physicians, said, “Data have been collected and used since the inception of the NHS to monitor the performance of hospitals. They have never been designed to monitor individual physicians, and their unsuitability for this purpose is confirmed by this study.”

    Dr Croft said that structured electronic records for each patient that captured data at the point of care would provide better information to assess doctors' performance.

    The iLab Project Evaluation Report is available at
