Final response from the authors on the integrity of outcome reporting in the REEACT trial
We thank Mr Drysdale and colleagues for their response.
At the risk of extending this correspondence in a way that may imply we are questioning the value of their work, we would make the following final points.
We accept that our reporting of the change in the primary depression outcome in the BMJ paper could have been better. Whether our error is fairly labelled with the stark term ‘outcome switching’ is open to debate, but we accept that, by their rules, we were not entirely transparent and that we meet their criteria for such a rating. The BMJ has added additional information for readers (http://www.bmj.com/content/352/bmj.i195) and we have learned a valuable lesson for the future – which we agree is a positive outcome of their project. There was complete transparency in relation to protocol changes in our full report (freely available in the public domain), where we devoted a whole chapter to the subject.
We do not accept their response in terms of the EQ5D.
They state ‘publishing quality of life data in a separate academic paper is in no sense a recognised universal convention’.
We did not suggest it was a universal convention – just that it was ‘not unusual’.
They state that ‘we therefore think that readers should have been told that an extra three secondary outcomes were pre-specified and that there was a plan to report these elsewhere’.
The BMJ report does clearly state that ‘we present the results of a full economic evaluation elsewhere’ (reference 32). That could have been more specific (again, a useful learning point), but it is explicit that the BMJ report is not a complete report of outcomes. A casual reader might be forgiven for not wanting to follow that up in the interests of time. To ignore that clear reference to other data when judging the reporting of outcomes seems odd.
They state that ‘Expecting all readers to know that they should also be reviewing an additional set of documents - which may or may not exist, for a subset of trials depending on funder, with an unspecified time delay - represents an even more impractical method to ensure transparent outcome reporting’.
Mr Drysdale and colleagues paint a compelling picture here. However, that picture is not very convincing in the case of REEACT. The existence of the economic outcomes in another document is clearly communicated (a journal title and a designation of ‘in press’ was the maximum reference detail that we were able to provide at that time). This in no way entailed the kind of complex detective work they imply. A simple email to the corresponding author could have cleared the matter up very quickly – readers will note that our team responded very quickly to their initial letter.
We note that the COMPARE website refers to outcomes reported in a 'final paper', while their BMJ response refers to a 'primary academic publication'. Does the BMJ paper on clinical outcomes meet the criteria for either? Both terms are ambiguous. If they had asked, we would have recommended they treat the final HTA report (published a matter of weeks after the BMJ paper) as the definitive source, as it is unrestricted in length and includes the maximum level of detail. Again, an email would have been sufficient to clear this up – hardly ‘impractical’, and a reasonable expectation given the serious concerns they are raising.
Again, we note that the HTA report contains a whole chapter detailing the inevitable protocol changes that were made in the course of a five-year project working under the NHS research governance framework. Maximum transparency; public domain; open access; no switching of outcomes. We also provided everything that was asked of us by the editors of the BMJ in the pre-publication process. Thus our disquiet with their ‘rush to judgement’ on the basis of the BMJ paper alone remains.
We appreciate that the COMPARE process may be onerous. As researchers, we are very aware of the pressures of workload and the need for pragmatic decisions to make projects feasible.
However, REEACT is still assessed as ‘60% of pre-specified outcomes reported’ (as of 14/1/2016) on a public website ‘tracking switched outcomes in clinical trials’. We think that is a serious charge which is neither entirely fair nor accurate given the information in the BMJ paper and the clarifications about the EQ5D which we have provided. Having endured the more onerous task of delivering the REEACT trial (we calculated over 100,000 person-hours), they can perhaps understand that our sympathy with the time requirements of their system is more limited.
We wish the team luck with the remainder of their project.
Competing interests: No competing interests