Rapid response to:

Information In Practice

Performance league tables: the NHS deserves better

BMJ 2002; 324 doi: (Published 12 January 2002) Cite this as: BMJ 2002;324:95

Rapid Response:

Performance League Tables: Towards a Reflective Environment

The paper by Adab et al (1) raises an important topic, but
unfortunately puts a drive for precision ahead of constructive reflection.
It is also based on the false premise that NHS components are homogeneous
parts of a single system. In practice, every unit is a unique mix of
people, facilities, and fabric meeting a local need, and it is the role of
management and clinical leadership to compensate for specific local
disadvantages whilst maximising strengths. Devlin's work in demonstrating
unacceptable variations in perioperative death rates powerfully
illustrated the range of human performance (2).

Statistical identification of outliers for further investigation is a
useful tool, not for diagnosing problems or claiming competitive
advantage, but for indicating issues for further local study. Both
clinical audit (3,4) and benchmarking (5) were developed as tools for this
managerial triage; as soon as such techniques become the basis for praise
or for punitive measures, they not only acquire a false degree of
intrinsic accuracy, but acceptance and participation are undermined.

In focussing on the advancement of alternative methodologies to the
currently published performance measures, Adab and colleagues demonstrate
the diversion of seeking the Holy Grail rather than recognising the
utility and function of available vessels. Visual inspection of the
30-day mortality table which they reproduce as Figure 2 quite clearly
shows five outliers with putatively poor performance, and one star
performer. Their alternative methodology confirms the star and three of
the poorly performing outliers; they have plotted two other near-outliers
which they do not identify. As a means of identifying outliers for
further local, situation-specific investigation, their method is therefore
merely an alternative with similar results; it eliminates crude ranking,
but overall identification of comparative performance is still, rightly,
clear.
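The kind of first-level indicative analysis described above can be sketched simply: flag any unit whose mortality proportion falls outside approximate 3-sigma control limits around the pooled rate. The figures below are hypothetical illustrations, and this is a minimal sketch of the general control-limit approach, not the specific method used by Adab and colleagues.

```python
# Illustrative sketch: flagging outlier units on 30-day mortality using
# approximate 3-sigma binomial control limits around the pooled rate.
# All unit names and figures are hypothetical, not taken from the paper.
import math

units = {  # unit name: (deaths within 30 days, cases)
    "A": (30, 1000),
    "B": (55, 1000),
    "C": (18, 600),
    "D": (90, 1200),
    "E": (25, 900),
}

total_deaths = sum(d for d, n in units.values())
total_cases = sum(n for d, n in units.values())
p = total_deaths / total_cases  # pooled mortality proportion

def classify(deaths, cases, pooled, z=3.0):
    """Return 'high', 'low', or 'within' relative to z-sigma control limits."""
    se = math.sqrt(pooled * (1 - pooled) / cases)
    rate = deaths / cases
    if rate > pooled + z * se:
        return "high"
    if rate < pooled - z * se:
        return "low"
    return "within"

flags = {name: classify(d, n, p) for name, (d, n) in units.items()}
print(flags)
```

A unit flagged "high" is not thereby diagnosed as failing; as argued above, the flag is merely a trigger for local, situation-specific review.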

Addressing the issues of quality of performance and outcome in
healthcare requires a reflective environment and constructive study. The
alchemist’s quest for the golden bullet which will home in unerringly on
worst performance is potentially as diversionary as the political drive to
name and shame, or the statistical impossibility of seeking performance
analysis with no-one in the lower quartile. CEPOD (2) and the WHO
(Europe) networks in diabetic care (6) are two strong examples of the
benefit achieved when quality improvement rests on constructive
consideration of objective analysis of actual outcomes. Even then, the
major gains come only from committed clinical and managerial leadership
positively addressing individual, specific circumstances, triggered by
comparatively simple first-level indicative analysis.

A constructive and trusting environment, locally and nationally, is
essential, together with recognition of the kaizen philosophy that many
small improvements have more impact than occasional big demonstrations.
Neither political sledgehammers nor analytic diversions will achieve
this, and statistics can give no more than an indicative filter; local
review and action should be the norm and essence of management.

(1) Adab P, Rouse AM, Mohammed MA, Marshall T. Performance league
tables: the NHS deserves better. BMJ 2002;324:95-98.
(2) Buck N, Devlin HB, Lunn JN. The Report of a Confidential Enquiry
into Perioperative Deaths. Nuffield Provincial Hospitals Trust, London,
1987.
(3) Standing Medical Advisory Committee. The Quality of Medical
Care. HMSO, London, 1980.

(4) Shaw CD. Medical Audit – A Hospital Handbook. King’s Fund
Centre, London, 1989.

(5) Clinical Management Unit. National Pathology Alliance
Benchmarking Review. Centre for Health Planning and Management, Keele.
(6) World Health Organisation Regional Office for Europe. Quality of
Care and Activities Report 1994/95. World Health Organisation,
Copenhagen, 1996.

Competing interests: No competing interests

13 February 2002
Michael J Rigby
Senior Lecturer
Centre for Health Planning and Management, Darwin Building,
Keele University, Keele, Staffordshire, ST5 5BG