Feature: NHS Organisation

How different are NHS systems across the UK since devolution?

BMJ 2013; 346 doi: http://dx.doi.org/10.1136/bmj.f3066 (Published 14 May 2013) Cite this as: BMJ 2013;346:f3066
Nigel Hawkes, freelance journalist, London, UK
Nigel.hawkes1@btinternet.com

As NHS systems have moved in different directions, Nigel Hawkes examines the challenges of determining whether one country is doing better than the rest

Almost 14 years after devolution sent the component parts of the United Kingdom off in different directions, it ought to be possible to work out whether the quasi-market of the English NHS has worked any better than the more statist approach retained (or reintroduced) in Scotland, Wales, and Northern Ireland.

There are certainly strong views across the different borders. Seen from a London perspective, the policies of the devolved administrations seem timid and complacent. But the view from Edinburgh, Cardiff, and Belfast is that the English NHS is chasing a chimera of market driven improvement that owes more to doctrine than to evidence.

Neither of these caricatures is wholly true, but real differences in approach between all four systems do exist. So does the emerging evidence indicate whether any one approach is producing better results than the others?

The National Audit Office (NAO) and the Nuffield Trust have both carried out comparisons recently and, although the data kept by each system are not directly comparable in many areas, their reports suggest substantial variations in areas that are important to patients, such as staff to patient ratios, waiting times for inpatient treatment, and quality of care.

Scotland has the most staff, the worst health, and the lowest hospital productivity, according to Nuffield’s report,1 while England occupies the other extreme on all these measures. Scotland narrowly outperforms England in the Quality and Outcomes Framework for primary care, as does Northern Ireland. Scotland’s waiting times for inpatient treatment are also slightly better than England’s, according to the NAO, with Scotland claiming that 92% of patients are treated within 18 weeks against 90.5% in England.2 Both reports found that England spends the least money per head of population and Scotland the most. Wales spends roughly as much as England per capita but employs substantially more management and support staff than any English region. (Comparable figures for support staff in Scotland and Northern Ireland do not exist.) Northern Ireland is the only part of the UK where health and social care are combined.

These comparisons all come with the caveat that the failure to introduce comparable data collection systems at the time of devolution makes it impossible to rank the performance of each system definitively. For this reason, the two most comprehensive reports reach different conclusions in some areas.

Waiting times

The findings on waiting times need qualification, as the waiting time expert Rob Findlay has pointed out.3 The NHS in Scotland met the 18 week target when it first came into force there in December 2011 by ignoring some patients who had already passed the 18 week mark. So long as this group represents less than 10% of the total they can in principle be left untreated for ever, since the target demands that 90%, not 100%, are treated within 18 weeks. The incentive is to treat new patients and feed in old ones at a rate of less than 1 in 10, so the 90% target is met; but this is not fast enough to mop up all the long waiters, so their number tends to increase. Exactly the same happened in England, where 18 weeks was achieved four years earlier—but unlike England, Scotland does not collect data for incomplete patient journeys, making the problem harder to spot.
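The arithmetic can be sketched in a few lines of Python. This is an illustrative toy simulation with assumed weekly numbers, not data from Scotland or England: new referrals are treated within 18 weeks, long waiters are admitted at fewer than 1 in 10 of all admissions, and performance is measured only on completed journeys.

def simulate(weeks=52, backlog=500):
    # Assumed weekly flows, chosen only to make the arithmetic visible
    new_treated_per_week = 100   # new referrals treated inside 18 weeks
    long_treated_per_week = 5    # long waiters admitted: fewer than 1 in 10 of admissions
    new_breaches_per_week = 8    # patients newly passing the 18 week mark untreated
    for _ in range(weeks):
        treated = new_treated_per_week + long_treated_per_week
        # Reported performance counts only patients whose journeys are complete
        performance = 100 * new_treated_per_week / treated
        backlog += new_breaches_per_week - long_treated_per_week
    return performance, backlog

performance, backlog = simulate()
print(f"Reported 18-week performance: {performance:.1f}%")  # about 95%, target met every week
print(f"Untreated long waiters after a year: {backlog}")     # up from 500 to 656

Because untreated long waiters never enter the denominator, the reported figure stays comfortably above 90% even as their number rises.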

Scotland also has many ways of removing patients from the list, some of which have been abused by health boards to flatter their performance. A quarter of patients on the Scottish waiting lists are declared “unavailable” and the clock is then reset. This can happen if patients change their appointments, refuse to be sent to a hospital outside their area, or find the seven days’ notice of admission too short (in England, it’s three weeks). Scotland has tackled some of these problems in its “12 week treatment time guarantee,” which came into force in October 2012 and covers all patients. But, as Findlay points out, any target requiring 100% compliance must have get-out clauses to allow for exceptional cases, which means there are lots of rules. He argues that the English rules are much more patient friendly and inclusive than the Scottish ones.

What’s his conclusion? “England started targeting referral to treatment waiting times some years before Scotland, so perhaps unsurprisingly England has more complete and reliable data, better and simpler policies and standards, and (probably, because the published Scottish data are difficult to compare) shorter waiting times,” he says.

Wales has set a less demanding target—95% to be treated within 26 weeks—and consistently fails to meet it. In December last year, for example, only 82.9% of patients were treated within 26 weeks, leaving 36 000 waiting longer than that. The Welsh government blames “unprecedented emergency care pressures” for the failure, urging local health boards to do better. Professor Marcus Longley, director of the Welsh Institute for Health and Social Care, said: “Wales has rejected the quasi-market approach and there is very little political support for competition between providers. The emphasis seems to be on a combination of exhortation to do better, backed up by performance management by the Welsh government.”

But the failings have given the Westminster government a stick to beat Labour with. In the House of Commons on 23 April, the prime minister, David Cameron, asserted that Labour had been in charge of the NHS in Wales for three years and it hadn’t hit an emergency target during that period. “Last time the urgent care cancer treatment target was met in Wales, anyone? 2008,” he taunted. “Last time A and E [accident and emergency] targets were met? 2009. The Welsh Ambulance Service has missed its call-out target for the last 10 months.”

Waiting time targets are also missed in Northern Ireland, which aims to treat 50% of patients within 13 weeks and 100% within 36 weeks. It meets the first part of the target but misses the second. At the end of 2012, more than 2200 patients (4.4%) had been waiting for more than 36 weeks. Waits in emergency departments can also be long—in March, 299 patients waited more than 12 hours in the emergency department at Antrim Area Hospital. Kieran McCarthy, a member of the Stormont Health Committee, said: “Most recent official figures on waiting times for outpatient appointments and in emergency departments are very worrying. It is something that Edwin Poots [Northern Ireland’s health minister] needs to get a handle on.”

Quality of care

The UK Statistics Authority is carrying out a review of waiting time statistics, which is expected to call on the producers to do more to provide data that are comparable across the four countries.4 But comparing quality of care in the four nations is even harder than comparing waiting times. The NAO used emergency admissions per 100 000 people as a proxy for the quality of primary care, on the basis that good primary care should reduce emergency admissions. Using 2009-10 data, it found that Wales fared worst, with 11 471 admissions per 100 000 compared with 9994 in England, 9917 in Scotland, and 8274 in Northern Ireland (table).

Table: Key health facts for UK countries2

But the Nuffield Trust argues that a better basis for comparison is not the whole of England but north east England, since its history, population size, and economic situation are more directly comparable with those of the devolved nations. The north east, with 13 030 emergency admissions per 100 000 people, does much worse than Scotland, Wales, or Northern Ireland.

Staff numbers

The admissions figures hint at better primary care in the devolved nations, possibly excluding Wales. The most obvious reason is greater staff numbers. Scotland, where per capita spending on health is highest, had 80 general practitioners per 100 000 people in 2009, against 70 in England and 65 in Wales and Northern Ireland. The three devolved administrations also had considerably more nursing, midwifery, and health visiting staff than England in the same year (1124/100 000 in Scotland against 846 in England, for example). Survey data show that Scottish GPs are under less pressure, seeing 112 patients per week in 2009 compared with 132 per week in England, 137 in Wales, and 126 in Northern Ireland. Given their greater numbers, this is what you would expect.

Scotland also has higher bed numbers than the other UK countries—500 per 100 000 people in 2008-09, compared with 310 in England, 440 in Wales, and 430 in Northern Ireland (figure). Length of stay is also much lower in England, at 4.3 days against 5.7 in Scotland, 6.3 in Wales, and 5.5 in Northern Ireland. The NAO found that lower lengths of stay were correlated with quality of care, measured by death rates and patient satisfaction scores. England also treats more patients as day cases than Scotland or Wales, but its figures are matched by Northern Ireland.

Figure: Available hospital beds (excluding day beds) in UK countries, 1999-2000, 2005-06, and 2008-092

Scotland’s large number of hospital beds may, says the NAO, be the result of a higher number of beds for elderly patients than elsewhere in the UK. While bed numbers, length of stay, and proportion of day cases may not unambiguously be markers of quality, they are markers of the efficient use of resources, so England can claim to be the most cost effective of the four in hospital use.

Different philosophies and policies

England’s NHS is not solely reliant on market forces such as the purchaser-provider split and patient choice to generate improvement: it has also relied, perhaps more heavily, on “targets and terror.” Scotland, by contrast, lays emphasis on collaboration. Charles Saunders, deputy chairman of the BMA in Scotland, says: “We operate within a system of collaboration and cooperation, while the English reforms are very much founded on the principle of competition. We do not support this approach. Patients are not products, and the NHS is not a business in the commercial sense of the word.” The only competition that exists in Scotland is between the health boards, which have some freedom to do things differently but within an overall framework set by the Scottish government.

The devolved nations each have distinctive policies that set them apart from England. Scotland’s is public health: it has used its legislative powers to become the first part of the UK to ban smoking in public places, to raise the age for purchasing tobacco from 16 to 18, and to pass a law setting a minimum price for alcohol. Wales has provided free prescriptions and free hospital parking, while Northern Ireland is implementing a major change in health and social care that envisages more care being delivered in the community and less in hospitals.

So is it possible to say which system is doing best? Not in a sentence. England gets the most from the least, so is by some margin the most productive of the four. In times of austerity, that is a substantial bonus. But the extra resources provided to Scotland seem to deliver slightly better primary care through employing greater numbers, so cannot be said to be wasted. Wales is not as well resourced as Scotland or as focused as England—it has a much longer “tail” of management and support staff and gets poorer results on most measures. Northern Ireland provides only limited evidence that integrating health and social care makes for better outcomes.

Anyone who has attempted to make direct comparisons has complained that the data they need are not being collected, are not comparable, or are not reliable. Most recently, the NAO listed five indicators it would “like to have used” to compare the four systems—health outcomes, spending, unit costs per staff member, efficiency and productivity, and quality and effectiveness—none of which worked as well as it might have because of data insufficiencies of one sort or another. The Nuffield Trust report also raised important questions about the availability of comparable data to allow differences to be analysed in future. “Without such comparable data, UK taxpayers and HM Treasury cannot know whether they are securing value for money for their health services,” it concluded.

Such are the caveats that neither Nuffield nor the NAO drew strong conclusions. Nuffield was the bolder, saying that its analysis “suggests that England’s NHS spends less and has fewer staff per capita than the health services in the devolved countries, but that it makes better use of its resources with respect to delivering higher levels of activity and productivity, and lower waiting times.” NAO was more circumspect. “Without a single overarching measure of performance, we cannot draw conclusions about which health service is achieving the best value for money,” it concluded. “Where comparative data are available, we found that no one nation has been consistently more economic, efficient or effective across the indicators we considered.” The Nuffield Trust promises a follow-up report, using newer data, later this year.

Interestingly, a survey of public satisfaction with NHS services, the British Social Attitudes survey, showed little difference in national attitudes. There was a sharp fall in satisfaction in 2011, which in England might be explained by the row over the coalition’s reforms. But England’s problems did not make the public in Scotland or Wales feel any more satisfied; in fact, their views moved in lockstep with those in England, as they have done consistently since devolution in 1999. So differences in organisation—however important they may seem to economists and policy makers—seem to have relatively little effect on public attitudes. Few, after all, have the chance to make a direct comparison by sampling care the other side of the border.

How the NHS is organised in the four countries of the UK

  • England—Spending in 2012-13 was £108.9bn (€129bn; $168bn). Purchasers (211 clinical commissioning groups, covering an average population of 255 000 each, but ranging from less than 100 000 to almost 1 million) spend £65bn a year buying care from any provider that meets quality standards. Specialist care—worth a further £30bn—is commissioned by NHS England. Patients have choice of hospitals for elective care and hospitals are paid per unit of activity (payment by results), which accounts for half their income; the remainder comes from locally negotiated block contracts. Foundation trusts have been partially liberated from central control

  • Scotland—Spending in 2012-13 was £9.38bn. No separation between purchasers and providers since 2004. Fourteen health boards, covering an average population of 370 000 (range 20 000 to 1.2 million), both plan and deliver services. Patients do not have choice of secondary referral but boards do: if they are unable to provide care within the 12 week treatment time guarantee, they can send patients to another board. No foundation trusts

  • Wales—Spending in 2012-13 was £5.3bn. No purchaser-provider split since 2009; seven health boards with an average population of 430 000 (135 000–689 000) plan, secure, and deliver care. No patient choice in secondary care; no foundation trusts

  • Northern Ireland—Spending in 2012-13 was £3.9bn. One board covering 1.8 million people and providing both health and social care. It commissions care through five committees, called local commissioning groups. The purchaser-provider split is maintained


Footnotes

  • Competing interests: I have read and understood the BMJ Group policy on declaration of interests and have no relevant interests to declare.

  • Additional reporting by Lisa Campbell in Northern Ireland, Roger Dobson in Wales, and Bryan Christie in Scotland.

  • Provenance and peer review: Commissioned; not externally peer reviewed.

References