
Rapid response to:

Letters: Chronic fatigue treatment trial

PACE trial authors’ reply to letter by Kindlon

BMJ 2013; 347 doi: https://doi.org/10.1136/bmj.f5963 (Published 15 October 2013) Cite this as: BMJ 2013;347:f5963

Rapid Response:

Re: PACE trial authors’ reply to letter by Kindlon

PACE trial steering group fail to spot error in reasons for protocol changes

Recent correspondence from White, Kindlon, Lynch and others concerning changes to the PACE trial protocol should be seen within the context of the current debate on the governance of clinical trials. Campaigns such as AllTrials are pushing for the publication of trial protocols, but is it then acceptable to change a published protocol, and if so, how and for what reasons? Justifying changes by saying ‘Protocol changes are not that unusual in large trials’, as Lynch [1] does, will not enhance trust in published results. A further consideration is how much stricter the governance processes should be for trials such as PACE that are not blinded, since in an unblinded trial even changes made before data analysis could introduce bias.

It may be necessary to change a protocol to take account of new research findings, or to change statistical tests to better match the distribution of the data. However, widespread protocol changes, particularly in the way outcomes are measured, suggest either a weak protocol design and validation process or tinkering to improve results. Large trials typically follow smaller exploratory trials, which should reduce the need for later changes. PACE was a large trial following similar smaller trials, so there is no excuse for needing to change the protocol.

To examine briefly a couple of the many changes: Firstly, White et al [2] changed the scoring system used for one of the primary outcomes, giving as their reason that “we changed the original bimodal scoring of the Chalder fatigue questionnaire (range 0–11) to Likert scoring to more sensitively test our hypotheses of effectiveness.” However, they present no evidence or references to support this assertion, or even to show that the Likert scoring had been validated. The FINE trial ran along similar lines to PACE and reported that a non-significant difference under bimodal scoring became a significant difference under this alternative Likert scheme [3].
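
For readers unfamiliar with the two schemes, the following sketch (in Python) illustrates how the same answers yield a 0–11 bimodal score or a 0–33 Likert score. It assumes the usual coding of each of the 11 questionnaire items from 0 to 3 and uses a hypothetical set of answers, not trial data:

# Illustrative only: hypothetical responses, not PACE trial data.
# Each of the 11 Chalder fatigue questionnaire items is assumed to be
# coded 0-3 ("less than usual" ... "much more than usual").

def bimodal_score(responses):
    # Original protocol scoring: an item counts 1 if endorsed (coded 2 or 3),
    # otherwise 0, giving a total in the range 0-11.
    return sum(1 for r in responses if r >= 2)

def likert_score(responses):
    # Revised scoring: each item contributes its full 0-3 value, range 0-33.
    return sum(responses)

answers = [2, 1, 2, 3, 1, 2, 2, 1, 3, 2, 1]   # hypothetical participant
print(bimodal_score(answers))   # 7
print(likert_score(answers))    # 20

Because the bimodal scheme collapses each item to 0 or 1, the two totals are not simple rescalings of one another, which is consistent with the FINE trial finding that statistical significance can depend on which scheme is used.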

Secondly, they made an error in their reasoning about changes in the thresholds used to define recovery. White et al justify the change in one threshold for recovery as “We changed our original protocol’s threshold score for being within a normal range on this measure from a score of <85 to a lower score as that threshold would mean that approximately half the general working age population would fall outside the normal range. The mean (S.D.) scores for a demographically representative English adult population were 86.3 (22.5) for males and 81.8 (25.7) for females (Bowling et al. 1999) [5]. We derived a mean (S.D.) score of 84 (24) for the whole sample, giving a normal range of 60 or above for physical function.” [4] The data underlying the Bowling survey are available in the national archive [6]: the median score for those of working age (under 65) is 100, and the 25th percentile is 90. The PACE trial authors’ assertion is therefore wrong. The mean value White et al use is for the overall population, but even there the median is 95 (that is, half the population score 95 or more). Their mistake was to use the mean and standard deviation, which do not properly characterise this distribution: SF-36 physical function scores are bounded at 100 and bunched towards that ceiling, so the mean sits well below the median and a “mean minus one standard deviation” threshold of 60 falls far below the score that separates even the lowest quarter of the population.
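
The statistical point can be illustrated with a short simulation (Python with numpy). This is a sketch using artificial scores drawn from an assumed skewed, ceiling-bounded distribution, not the Bowling et al or OPCS data; it only shows how a mean minus one standard deviation cut-off behaves for this kind of distribution:

# Illustrative simulation only: NOT the Bowling et al. (1999) / OPCS data.
# Shows how a "mean - 1 SD" threshold behaves for a ceiling-bounded,
# left-skewed distribution with summary statistics broadly similar to
# those quoted above (mean around 84, SD around 24).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical SF-36 physical function scores: most respondents near the
# ceiling of 100, with a long tail of lower scores.
deficits = rng.gamma(shape=0.44, scale=36.0, size=100_000)
scores = np.clip(100 - deficits, 0, 100)

mean, sd = scores.mean(), scores.std()
print(f"mean - 1 SD:     {mean - sd:.0f}")            # falls well below the lower quartile
print(f"median:          {np.median(scores):.0f}")    # well above the mean
print(f"25th percentile: {np.percentile(scores, 25):.0f}")
# For a left-skewed distribution like this, the median exceeds the mean and
# the mean - 1 SD cut-off sits below the 25th percentile, so neither "half the
# population falls below 85" nor "60 marks the normal range" can be read off
# the mean and standard deviation alone.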

This leads to the question of how clinical decision makers and patients can know whether a steering committee has provided effective governance. We do not know what information it was given and considered when approving changes to the protocol. The peer review process at the journals publishing the various PACE trial results has not been effective in ensuring full explanations of the reasons for protocol changes made after publication. It would have been helpful if White et al had released results according to the original protocol alongside their new versions. Such transparency would have gone some way to reducing patients’ concerns and would have reduced the freedom of information requests that White is so keen to complain about.

Transparency of the governance process is necessary to engender trust, particularly where changes are made to a protocol. Transparency over the full set of results is also advantageous. By keeping parts of the clinical trial data and its governance process secret, researchers create unnecessary uncertainty for patients and clinical decision makers about the effectiveness of treatments and the associated risks. For many this is unacceptable.

[1] Lynch S (2013). Beyond PACE - moving the debate forward. BMJ Rapid Response. http://www.bmj.com/content/347/bmj.f5963/rr/671093

[2] White PD, Goldsmith KA, Johnson AL, Potts L, Walwyn R, DeCesare JC, Baber HL, Burgess M, Clark LV, Cox DL, Bavinton J, Angus BJ, Murphy G, Murphy M, O’Dowd H, Wilks D, McCrone P, Chalder T, Sharpe M; PACE trial management group (2011). Comparison of adaptive pacing therapy, cognitive behaviour therapy, graded exercise therapy, and specialist medical care for chronic fatigue syndrome (PACE): a randomised trial. Lancet 377, 823–836.

[3] Wearden AJ, Dowrick C, Chew-Graham C, Bentall RP, Morris RK, Peters S, Riste L, Richardson G, Lovell K, Dunn G (2010). Authors’ reply. BMJ 340, c2992. doi:10.1136/bmj.c2992

[4] White PD, Goldsmith K, Johnson AL, Chalder T, Sharpe M; PACE Trial Management Group (2013). Recovery from chronic fatigue syndrome after treatments given in the PACE trial. Psychological Medicine Jan 31:1-9.

[5] Bowling A, Bond M, Jenkinson C, Lamping DL (1999). Short Form 36 (SF-36) Health Survey Questionnaire: which normative data should be used? Comparisons between the norms provided by the Omnibus Survey in Britain, the Health Survey for England and the Oxford Healthy Life Survey. Journal of Public Health Medicine 21, 255–270.

[6] Office of Population Censuses and Surveys. Social Survey Division, OPCS Omnibus Survey, November 1992 [computer file]. Colchester, Essex: UK Data Archive [distributor], September 1997. SN: 3660, http://dx.doi.org/10.5255/UKDA-SN-3660-1 Data Accessed 8-3-2013

Competing interests: No competing interests

21 November 2013
Adrian Baldwin
Carer
None
Bristol