I have found the papers by Bhatt et al. and Storvik et al. illuminating and applaud their efforts, which have assisted in containing the COVID-19 pandemic. I appreciate that a member of this audience (from Brazil) acknowledged using methods from Bhatt et al. for their own work.

I have tried to replicate at least part of the results of Storvik et al. using their own instructions from the paragraph entitled ‘Code and data’, but was unable to get the code to work. While I appreciate that computational techniques themselves can be complex, unless the techniques presented in an article have a realistic chance of being replicated by other researchers, I fear that the underlying science may be inaccessible not only to current researchers but also to posterity. Such inaccessibility tends to undermine the very purpose of publishing research results.

I would be interested to know whether the official reviewers of the article by Storvik et al. (Professor Diggle, perhaps) had any success in replicating at least part of the results. If not, we have an uncomfortable situation where we have to take the authors’ results as given.

To test the value of the code and data sets provided to the Journal of the Royal Statistical Society Series C, I ran the code supplied by the authors of the first three articles that used R software (with which I am familiar), available on the web page: https://rss.onlinelibrary.wiley.com/hub/journal/14679876/series-c-datasets/69_5. To my disappointment, a barrage of error messages greeted my efforts. Although a sample of three would not pass for a study design, I fear that I have stumbled upon a problem pattern involving author-supplied code and data that needs addressing. I believe that the progress of statistics should not be obfuscated or thwarted by researchers’ mathe-magic or computational sleight of hand.

Whereas the reproduction number of the COVID-19 virus has rightly focused our attention, the matter of reproducibility of research studies remains a persistent problem in many areas of science (Baker, 2016). For the future, I propose that the RSS consider requiring, as a condition of publication, that the reviewers be able to replicate, and vouch for the reproducibility of, at least some if not all of the researchers’ results. I wish to see the day when authors who genuinely believe in the value of their research would put up a video (say, on YouTube) demonstrating how others could replicate their results.

Acknowledgments

I wish to acknowledge that, following a formal request by the editor, just before the article by Storvik et al. went to press, one of the paper's co-authors, Dr Palomares, arranged a Microsoft Teams call with me. Dr Palomares shared his laptop computer screen and kindly took me through a cut-down version of the simulations, demonstrating the satisfactory working of the R code (available on GitHub) that produced representative simulation results.

Reference

Baker, M. (2016). 1,500 scientists lift the lid on reproducibility. Nature, 533, 452–454. https://doi.org/10.1038/533452a

Author notes

Conflicts of interest: None declared.
