Monday, February 13, 2012

The making of a myth

A decline in the productivity of the National Health Service (NHS) in England over the past decade has become a widely accepted fact. In March, 2011, Margaret Hodge, chair of the UK parliamentary Public Accounts Committee and a member of the government during the period in question, asserted that “over the last 10 years, the productivity of NHS hospitals [in the UK] has been in almost continuous decline”.1 Not surprisingly, this highly critical assessment was readily supported by the Secretary of State for Health, Andrew Lansley, who claimed there had been “a 15% reduction in productivity” in the previous year.2 Despite such confident statements, rather than declining, the productivity of the NHS has probably improved over the past decade. So how has the myth of declining productivity come about?

The main evidence cited by politicians was published in December, 2010, by the National Audit Office, which claimed that hospital productivity had declined by around 1•4% a year.3 This conclusion relied on an analysis done by the Office for National Statistics (ONS),4 which showed that overall NHS productivity in the UK had declined by 0•4% a year since 2000. How was this conclusion reached?

Following international accounting standards, the ONS compared the rise in expenditure between 2000 and 2008 (46•9% overall or 5•9% a year) with the rise in output (44•3% overall or 5•5% a year).4 The validity of such a comparison depends on how output was measured. The ONS based their estimate of the quantity of health care on three categories of activity (number of hospital admissions and out-patient attendances, number of consultations in primary care, and number of prescriptions dispensed), which had risen by 37•7% (4•7% a year). To account for improvements in quality (better outcomes and patient experiences), output was increased by a further 6•6% (0•8% a year). Clearly, in view of the small difference between the growth in expenditure and in output (0•4% a year), any shortcomings in the determination of either component could easily affect whether productivity is judged to have fallen or risen. How confident can we be about the analysis?
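The arithmetic behind the ONS comparison can be sketched with the rounded annual rates quoted above; this is a hypothetical back-of-the-envelope reconstruction for illustration only, not the ONS's considerably more elaborate methodology:

```python
# Illustrative sketch of the ONS productivity arithmetic, using the
# rounded annual rates quoted in the text (not the official method).
EXPENDITURE = 5.9  # % a year rise in inputs (spending), 2000-2008
QUANTITY = 4.7     # % a year rise in activity (admissions, consultations, prescriptions)
QUALITY = 0.8      # % a year adjustment for improved quality

# Output combines the growth in activity with the quality adjustment
output = QUANTITY + QUALITY           # ~5.5% a year rise in output

# Productivity change is output growth net of input growth
productivity = output - EXPENDITURE   # ~ -0.4% a year, ie, a decline

print(f"output growth: {output:.1f}% a year")
print(f"productivity change: {productivity:+.1f}% a year")
```

The small residual (0•4% a year) between two much larger numbers is exactly why modest errors in either component can flip the sign of the result.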

Despite the Public Accounts Committee describing the ONS report as the “most authoritative national measure” of productivity that exists, they recognised its shortcomings,1 as did the Department of Health,5 National Audit Office,3 ONS,4 and the independent researchers who had done some of the underlying analyses.6 There were concerns about the estimation of both the quantity and the quality of outputs. Quantity was based on only 80% of activities, on the assumption that the remaining activities had increased in a similar fashion. Since these other activities were predominantly community services, which were developing new, more productive clinical pathways during this period, the increase in the quantity of outputs could have been underestimated.

There are three concerns about the way improvement in quality was assessed. First, the relative importance ascribed to each of the three dimensions of quality—safety, effectiveness, and experience7—is unclear (eg, why does patient experience contribute only 2•5% to the overall assessment of quality?). This factor is important because of the differences in the extent to which each dimension improved. Second, the measures of each dimension were very restricted: safety was based on hospital standardised mortality ratios; effectiveness of hospitals was based on estimates of health gain in only 29 of 650 health resource groups, of which 25 were elective surgery; and effectiveness of primary care was limited to the control of blood pressure and cholesterol levels in patients with cardiovascular disease.

Third, much of the data used in the report might not be valid: some data were related to the UK rather than England,8 which outperformed other parts of the UK;9 hospital standardised mortality ratios are not a validated indicator of hospital safety;10, 11 and the assessment of health gain relied on data from privately funded patients who reported substantially less benefit than did NHS patients (for example, a hip replacement was assumed to result in a 68% increase in quality of life whereas recent data on NHS patients showed 153% improvement).6, 12 In view of these concerns, was an enhancement of 0•8% a year to the output of the NHS adequate?
A review of a much wider range of data than was previously available suggests substantial improvements in the quality of health care (webappendix). These data include mortality, case-fatality (or survival) rates, adherence to evidence-based clinical guidelines, patients' experiences, and public satisfaction. Relative improvements year on year can be seen.

Although health care can claim some of the credit for a decline in mortality,13 safer and more effective care has also contributed. Between 2000 and 2009, the relative fall in population mortality was about 2•5% a year, such that a baby born in 2009 could expect to live 3 years longer than one born in 2000. Case fatality rates based on specialist databases provide more direct evidence:14 declines occurred in adult critical care (2•4% a year), dialysis (3•3% a year), coronary artery bypass surgery (4•9% a year), and acute myocardial infarction (5•3% a year).
Other evidence comes from improvements in evidence-based clinical practice: a relative rise of 11•0% a year was reported for use of primary angioplasty after myocardial infarction with ST elevation, prescription of angiotensin converting enzyme (ACE) inhibitors after myocardial infarction rose 2•5% a year, premature discharge from adult intensive care fell 8•7% a year, and adherence to guidelines for patients who had had a stroke increased 5•7% a year.

Patients' experience of how they were treated also improved. There were annual relative increases in the proportion of patients treated within 4 h in accident and emergency departments (2•5% a year), operated on within 28 days of their operation having been cancelled for non-clinical reasons (10•4% a year), admitted to a single-sex ward (11% a year), and receiving a copy of the hospital letter sent to their general practitioner (10•3% a year). Such improvements were mirrored by public views of the NHS: the proportion of people satisfied with how the NHS was being run showed a relative rise of 4•1% a year, and those who felt the service was getting better rose by 10•8% a year.

Although all these data are subject to some uncertainty (eg, some deliberate distortion of data might have occurred) and could overestimate the change in quality that took place, the diversity and size of the annual relative improvements suggest that a 0•8% a year adjustment for improvement in quality is insufficient. Even an additional improvement of 0•5% a year (ie, 1•3% a year improvement in quality) would change the judgment that NHS productivity fell to a verdict that it rose between 2000 and 2009. Therefore, why has a judgment with such an uncertain basis been so widely accepted as a fact?
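The sensitivity of the verdict to the quality adjustment can be illustrated with the same rounded figures (again a hypothetical sketch under the simplified arithmetic above, not the ONS method):

```python
# How the productivity verdict flips with the quality adjustment
# (hypothetical sketch using the rounded annual rates quoted in the text).
EXPENDITURE = 5.9  # % a year growth in inputs
QUANTITY = 4.7     # % a year growth in activity

def productivity(quality_adjustment):
    """Annual productivity change (% a year) for a given quality adjustment."""
    return round(QUANTITY + quality_adjustment - EXPENDITURE, 1)

print(productivity(0.8))  # ONS adjustment: -0.4, productivity judged to fall
print(productivity(1.3))  # an extra 0.5 a year: +0.1, productivity judged to rise
```

A half-point shift in one poorly measured input is enough to reverse the headline conclusion.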

The source of the myth of declining productivity is not the economists who, in 2005, did the original calculations of health-care productivity for the Department of Health. They recognised the limitations of their estimates, spelt out the assumptions they were forced to make, and highlighted the implications.6 These reservations were understood by the ONS4, 8 and the Department of Health, which recognised that “health gain cannot be measured properly without more information about the outcomes of treatment” and that the adjustment for quality was “far from perfect”.5 Their conclusion was appropriately cautious.

So how did this myth become established? Policy analysis suggests four reasons. The first had its roots in party politics. Despite warnings, estimates suggesting a decline in productivity were seized on by opponents of government policy, fuelled by journalists seeking bad news. Attempts by commentators in the medical press to point out the dangers of misinterpretation had little effect.15, 16 Meanwhile, uncertainty among health services researchers as to how best to measure improvement in quality engendered scepticism about any method which, in turn, led to the dismissal of the issue as unimportant. Claims at the time by health ministers that, far from declining, productivity was rising,17, 18 were rejected by political opponents, citing the ONS reports as the most accurate estimates of productivity available.17 The opposition's task was made easier by some government back-benchers, at least in private, sharing their perception.19 By 2008, with the prospect of an imminent general election, repeated warnings from experts to treat the estimates cautiously were a forlorn hope.20

Until the election in 2010, debate as to whether productivity had fallen or risen was played out, predictably, largely on party lines. The claim of declining NHS productivity was needed to justify the reforms to the NHS that the Conservative Party wanted to introduce. During the election this issue remained a contested area; the myth was not yet established. The second reason for the establishment of the myth was the apparent abandonment by the Labour Party of belief in the policies it had pursued in government since 2000. It was as if, exhausted from years in power and bruised by an election defeat, Labour's will to defend past achievements had gone. The absence of opposition resulted in a consensus that NHS productivity had indeed declined.

The third reason reflects the limitations of parliamentary scrutiny of specialist topics. The Public Accounts Committee, while being the most prestigious of the select committees that scrutinise government, is also the most ambitious because its remit covers all public policies. The Committee has to pronounce on specialist areas in which the members have no expertise, so it depends on information provided by its secretariat and selected witnesses. Any expectation that the committee might challenge complex economic models underpinning headline conclusions is fanciful. With regard to NHS productivity, the situation was exacerbated by the lack of rival estimations. This might reflect a lack of interest in the topic among academics, their rejection of the feasibility of accurate estimation of health-care productivity, or a sense that other models are not needed because the ONS is seen as authoritative and independent of government.

The fourth reason was the shift in 2010 in the economic environment of public spending from growth to retrenchment. Experiences of improving the productivity of the NHS during a decade with increased expenditure were seen to have little relevance to the new situation of achieving improvement in productivity with a shrinking budget. Since the events of the past decade are perceived as having no current or future relevance, interest in defending past achievements has dissipated.
Declining NHS productivity in England between 2000 and 2009 is just one recent myth in health-care policy. Many other myths have arisen in the past and many more will do so in the future. We cannot prevent myths developing but we should remain vigilant, spot them as early as possible, and attempt to minimise the harm they can do in distorting understanding and misleading policy makers. Meanwhile, development of accurate, competing estimates of health-care productivity is needed, making use of the burgeoning array of high-quality data14 and translating improvements in quality into generic measures, such as quality-adjusted life-years.

Conflicts of interest
I declare that I have no conflicts of interest.
WebExtra Content

Supplementary webappendix
References
1 Commons Public Accounts Committee. MPs publish report on NHS hospital productivity. (accessed Dec 20, 2011).

2 House of Commons. Daily Hansard—Debate. (accessed Dec 20, 2011).

3 National Audit Office. Management of NHS hospital productivity. HC491. Session 2010–11.

4 Penaloza M-C, Hardie M, Wild R, Mills K. Public service output, inputs and productivity: healthcare. Cardiff: Office for National Statistics (UK Centre for the Measurement of Government Activity), 2010.

5 Department of Health. Healthcare output and productivity: accounting for quality change. (accessed Dec 20, 2011).

6 Dawson D, Gravelle H, O'Mahony M, et al. Developing new approaches to measuring NHS outputs and productivity. (accessed Dec 20, 2011).

7 Department of Health. High quality care for all. NHS Next Stage Review Final Report. (accessed Dec 20, 2011).

8 Hardie M, Cheers J, Pinder C, Qaiser U. Public service output, inputs and productivity: healthcare. Cardiff: Office for National Statistics, 2011.

9 Connolly S, Bevan G, Mays N. Funding and performance of healthcare systems in the four countries of the UK before and after devolution. London: Nuffield Trust, January 2010. (accessed Dec 20, 2011).

10 Black N. Assessing the quality of hospitals. Hospital standardised mortality rates should be abandoned. BMJ 2010; 340: 933–934.

11 Shahian DM, Woolf RE, Iezzoni LI, Kirle L, Normand S-LT. Variability in the measurement of hospital-wide mortality rates. N Engl J Med 2010; 363: 2530–2539.

12 NHS. Patient Reported Outcome Measures (PROMS). (accessed Dec 20, 2011).

13 Bunker J. Medicine matters after all. London: Nuffield Trust Publications, 2001.

14 Black N. High-quality clinical databases: breaking down barriers. Lancet 1999; 353: 1205–1206.

15 Berwick DM. Measuring NHS productivity. How much health for the pound, not how many events for the pound. BMJ 2005; 330: 975–976.

16 Black N, Browne J, Cairns J. Health care productivity. Is politically contentious, but can it be measured accurately? BMJ 2006; 333: 312–313.

17 BBC News. NHS productivity rate ‘falling’. (accessed Dec 20, 2011).

18 Social Market Foundation. Andy Burnham MP launches project looking at the future of the NHS. (accessed Dec 20, 2011).

19 Mullin C. Decline and fall: diaries 2005–2010. London: Profile Books, 2010.

20 BBC News. NHS sees annual productivity fall. (accessed Dec 20, 2011).

a Department of Public Health and Policy, London School of Hygiene and Tropical Medicine, London, UK
Correspondence to: Prof Nick Black, Department of Public Health and Policy, London School of Hygiene and Tropical Medicine, 15–17 Tavistock Place, London WC1H 9SH, UK

The Lancet, Early Online Publication, 13 February 2012