24 February 2016
Paul D. P. Pharoah
Doctor
University of Cambridge
Strangeways Research Laboratory, Worts Causeway, Cambridge
Re: Author’s reply to Goldacre
Neither of the reasons that Nick Freemantle cites for the sharing of analysis code being bad for science is valid. I agree that replication is important, but sharing code does not in any way prevent replication, whether the re-analysis uses the same software or alternative software. The second reason, that the code covers many pages of information, is simply irrelevant. Why is that a reason for not sharing? If anything, it is all the more reason to share: the longer the code, the more likely it is to contain an error.
It is clearly good practice to make analysis code available for scrutiny. Even the most fastidious of coders can make an error, and even the most respected software can contain bugs. As a reviewer I often try to replicate some of the simpler analyses reported in a manuscript, particularly if a result does not make sense to me (not all analysts are as competent as Nick Freemantle!). On one occasion I came across one such result. As is good practice, the authors had reported their methods in sufficient detail for me to attempt to replicate the finding using the same commands in the same software. I obtained the same findings, which still did not make sense. At this point I concluded that there was an error in the software itself. After much to-ing and fro-ing with the software support team, during which I was accused of not understanding my own data and basic statistics, an error was indeed found in the source code, and it was rectified in the next update. Although this was not an error in the analysis code, the story illustrates that even the best-checked code (here, commercial software) is prone to error, and that the more scrutiny code receives, the better.
If it is not to be obligatory to share code publicly, it ought, as a minimum, to be obligatory to share it with the editors and reviewers on a confidential basis.
Competing interests: No competing interests