A post-publication peer review success story

In 2016, Dr. Joel Pitt and Prof. Helene Hill published an important paper in ScienceOpen Research. In their paper, they propose new statistical methods to detect fraudulent scientific data, and they demonstrate their method on a single case of suspected fraud. Crucially, in their effort to combat fraud, Pitt and Hill made the raw data on which they tested their method publicly available on the Open Science Framework (OSF). Considering that a single case of scientific fraud can cost institutions and private citizens a huge amount of money, their result is provocative, and it underlines how important it is to make the raw data behind research papers publicly available.

The Pitt and Hill (2016) article has been read or downloaded almost 100 times a day since its publication on ScienceOpen. More importantly, it now has 7 independent post-publication peer reviews and 5 comments. Although this is a single paper in ScienceOpen’s vast index of 28 million research articles (all open to post-publication peer review!), the story of how this article attracted so much attention is worth re-telling.

Enhanced article-level statistics and context – just one of 28 million on our platform!

Peer review history

The manuscript was submitted and published in January 2016, and the final typeset version of the article was available for download on March 1st. Shortly afterwards, in May 2016, PhD student Chris Hartgerink publicly peer reviewed the article, summarising it as “Interesting research, but in need of substantial rewriting.”

It was after this that the article came to the attention of Prof. Philip B. Stark, an Associate Dean at the University of California, Berkeley, and author of the most highly read article on our platform with over 39,000 views to date!

Prof. Stark runs a course on the theory and application of statistical models. In his course, groups of students replicate and critique the statistical analyses of published research articles using the articles’ publicly available raw data. For this course to work, Prof. Stark needs rigorous research articles whose raw data are openly accessible. In this sense, Pitt and Hill’s article on ScienceOpen was an ideal candidate.

The groups of students started their critical replication of the Pitt and Hill article in the Fall semester of 2016 and finished right before the new year. By actively engaging with published research, the students gain the confidence and expertise to analyse it critically.

The Post-Publication Peer Review function on ScienceOpen is usually open only to researchers with more than 5 published articles, which would normally have barred Stark’s groups from publishing their critical replications. However, upon hearing about his initiative, ScienceOpen opened its review function to each of Prof. Stark’s vetted early career researchers. Importantly, since each peer review on ScienceOpen is assigned a CrossRef DOI along with a CC-BY license, after posting their reviews each member of the group has officially shared their very own scientific publication.

This also means that each peer review can be easily imported into any user’s ORCID, Publons, and even ImpactStory profiles – the choice is yours!

Public, post-publication peer review works

All of the complete peer reviews from the groups of students can be found below. They all come with highly detailed statistical analyses of the research, and are thorough, constructive, and critical, as we expect an open peer review process to be.

Furthermore, unlike almost every other post-publication peer review function out there, peer reviews on ScienceOpen can include integrated graphics and plots. This feature was added specifically for Prof. Stark’s course, but it is now available for any peer review on ScienceOpen.
