Category: Peer Review

UCL Press Megajournal – What’s next?

UCL Press announced ambitions for its megajournal project during a town hall event on January 16th 2018, with Dr. Paul Ayris, CEO of UCL Press and Pro-Vice-Provost (Library Services), and Prof. David Price, UCL Vice-Provost (Research), describing the wide-reaching goals and ideals that have motivated the university to pursue this undertaking. See the UCL news pages.

University College London

The town hall opened with Robert Kiley of the Wellcome Trust, who gave some insight into the successful Wellcome Open Research megajournal, describing the rationale behind the move and how its researchers have taken to it.

UCL Press, having partnered with ScienceOpen to provide a hosting platform for its current eight academic journals, invited Stephanie Dawson, CEO of ScienceOpen, to discuss further developments and the vision of providing researchers and publishers with the infrastructure for more open and transparent peer review and publication models, with improved search and discoverability.

Catriona MacCallum, previously with the Open Access publisher PLoS and a consultant on the first megajournal, PLoS ONE, and now Director of Open Science at Hindawi, then painted a broad picture of the values, tools, and advantages of an open science framework from individual, institutional, and societal perspectives.

Ian Caswell, UCL Press Journals Manager, then outlined the aims and ambitions of the UCL Press megajournal project: to offer researchers and academics the opportunity to publish cross-disciplinary and inter-disciplinary work, characterized by openness.

The next step of the UCL Press megajournal is to begin a campus-wide consultation on the needs and expectations of the UCL community in terms of open peer review and versioning, editorial oversight, topical focus, and technicalities.

Topical focus

A megajournal is by definition of broad scope so as to encourage inter-/cross-disciplinarity and to provide a publishing outlet for content that is not easily categorized. The UCL Press megajournal will begin with a focus on environmental sciences, including contributions from earth sciences, geography, UCL’s medical school, population sciences and UCL Institute of Education. Ultimately, the goal is to provide a platform for the entire university and beyond. Interested UCL researchers outside of these fields should contact UCL Press Journals Manager Ian Caswell about expanding the scope of the platform.

Some topical selection, however, can be very useful for readers in discovering new and related articles in their field. Traditionally, enforcing a narrow definition of scope has been the role of the editor. With this in mind, the ScienceOpen platform opens up the possibility for researchers to create their own topical selection from the whole scholarly corpus. UCL researchers are invited to explore this possibility and create a ScienceOpen “Collection” with the top articles in their fields that can also include articles published in the megajournal or other UCL Press journals. To apply for Collection Editor status contact Stephanie Dawson at ScienceOpen.

The aim of the UCL Press megajournal is to publish sound research rather than hyped-up results. It welcomes research of all kinds, such as negative or inconclusive results, descriptive papers, protocols, methods and data papers, and literature reviews. The focus of the platform will not be on the “impact factor”, but rather on individual article and author metrics, which can be tracked on the platform and used in individualized search and sort mechanisms within the ScienceOpen discovery environment. The consultation and development of the UCL Press megajournal are still ongoing, and further details will be announced as to its exact aims, scope, and submission criteria.

Open and Post Publication Peer Review

By utilising open peer review, we can promote accountable, responsible, and high-quality assessment and evaluation of publications. But what are the purpose and character of “open” and “post-publication” peer review in an open access megajournal? The tradition of publishing “book reviews” in the social sciences and humanities could provide one good model. Other platforms, such as Copernicus, F1000 Research, or the newcomer SciPost, have functional review systems that are closer to the journal peer review model. UCL Press will be consulting with researchers on how the platform can provide the best-quality feedback from peers in a constructive way within the technical scope of the platform.

The ScienceOpen platform infrastructure allows any registered user with an ORCID iD and “expert” status (5 published articles) to review any paper. The author or any other user can also invite reviewers via the platform. Potential reviewers who do not meet these basic criteria can still review an article if an editor decides to grant them reviewer status. Because each review receives a DOI and is deposited with the publishing metadata hubs Crossref and ORCID, anonymous and unaccountable reviews are effectively excluded from the platform.


If peer review is conducted transparently and openly, authors must have the possibility of revising their articles and tracking those revisions on the platform. The ScienceOpen platform can provide the infrastructure for this versioning system; however, questions remain about how versioning should inform best publication practice. Should the first submission of an article be regarded as a “preprint” that can be taken down if the community review is very negative, or be published elsewhere? Or should it be regarded as a publication from the start, with the first version retracted only in extreme cases? Each policy has advantages and disadvantages that require careful discussion before development into a working model.

Editorial oversight

The level of editorial oversight is another question that every megajournal must decide upon. In the first phase of the project, the megajournal is likely to focus on staff and students of UCL, though this does not necessarily mean it will be limited to UCL authors. As the journal expands its scope and audience to accept articles from beyond the university, it may become necessary to reassess and adjust the pre-publication review process to prevent poor or fraudulent research from being added to the corpus of published scholarly work.


The UCL Press megajournal will publish all articles open access under a Creative Commons CC BY license. The ScienceOpen platform will require ORCID iDs from all authors, and FundRef IDs for funding bodies are encouraged. Open references via Crossref, as part of the I4OC initiative, and open data summaries in manuscripts that link to or describe how to access the data underlying a publication will also be available to the UCL Press megajournal. All of these technicalities are still under consultation at UCL Press, and further announcements will be made on the UCL Press website and social media.


Launching a megajournal for UCL is a project that requires vision and commitment from the university and the community. Your feedback is greatly appreciated. Let’s change the landscape of scholarly communication together!

You will find the slides from the town hall event at DOI:, made available under a CC BY license.

Stephanie Dawson, ScienceOpen and Ian Caswell, UCL Press

The New Year under Review

Welcome to 2018! In December we highlighted our topical Collections on ScienceOpen and asked you to review any paper in a collection to enter a drawing for an Amazon Kindle Fire tablet. Today we would like to thank everyone who shared their expertise on ScienceOpen over the last year and are happy to announce the winner: Agustín Estrada Peña of the University of Zaragoza, Spain.

Street artist SAM3

Agustín is editor of the collection Ticks and Tick-Borne Pathogens, a comprehensive overview with over 11,000 articles covering the whole spectrum from biology and habitats to molecular mechanisms of disease and epidemiology. The ScienceOpen collection format allows researchers to search within these papers with a wide range of filters and quickly change the top view with sort by date, citations, Altmetric Score, usage and more to drill down and find interesting new work.

Agustín reviewed the paper The global distribution of Crimean-Congo hemorrhagic fever. More reviews of articles in this collection by the tick community are highly welcome. Help to make this an important resource for all! To learn how you can add to this knowledge database check out our resources on reviewing on ScienceOpen. Remember, all reviews are published with a CC BY Open Access license and receive a Crossref DOI. Continue reading “The New Year under Review”  

Increasing academic support for Open Peer Review

‘Open research’ isn’t just about sharing resources like data, code, and papers, although this is a big part of it. One big, often under-appreciated aspect is making research accessible, inclusive, and participatory. A major principle driving this is leveraging transparency to bring processes and factors that are currently hidden into public view.

One area of research and scholarly communication where this debate is still very much ongoing is peer review – our system of validation and gatekeeping for the vast archives of public knowledge.

OpenAIRE have released an important new survey and analysis on attitudes and experiences towards ‘Open Peer Review’ (OPR), based on more than 3000 respondents (full data available here to play with). This is important, as OPR is all about the principles above – making the process transparent, collaborative, inclusive, and in the end, better!

Below, we discuss some of the major findings of the survey, and how we at ScienceOpen fit into the bigger picture of Open Peer Review.

The future is Open

The main result of the survey is that the majority (60.3%) of respondents are in favour of OPR becoming a mainstream scholarly practice, particularly regarding open interaction, open reports, and final-version commenting. Part of this is due to relatively low satisfaction with the status quo: just 56.4% of respondents were satisfied with traditional closed peer review and 20.6% were dissatisfied – a much smaller gap than in previous reports. From the survey, more than three quarters of respondents had previously engaged with OPR as an author, reviewer, or editor. This suggests that OPR, in one form or another, is probably already more common practice than we might think.

Interestingly, this development is similar to what we saw with other aspects of ‘open science’ such as open access and open data – there is debate, experimentation, variable implementation, and finally they start to become accepted as the norm as policies, practices, and cultures adapt. The survey also showed that 88.2% of respondents were in favour of Open Access to publications, a much higher value than several years ago. It also found that support for OPR is correlated with support for Open Data and Open Access, which is perhaps not surprising, although conversations regarding OPR are still in their relative infancy.

This suggests that as debates around OPR mature, we are likely to see an increase in its uptake and support, as with other areas of ‘Open’. Indeed, the survey also found a generational difference in support for OPR, with younger researchers favouring it more than their more-established colleagues. As it is these generations who will inherit and govern the system in the future, it is more likely to have the characteristics that they favour.

Continue reading “Increasing academic support for Open Peer Review”  


Defining Open Peer Review at ScienceOpen

Recently, our colleagues at OpenAIRE have published a systematic review of ‘Open Peer Review’ (OPR). As part of this, they defined seven consistent traits of OPR, which we thought sounded like a remarkably good opportunity to help clarify how peer review works at ScienceOpen.

At ScienceOpen, we have over 31 million article records all available for public, post-publication peer review (PPPR), more than 3 million of which are full-text Open Access. This functionality is a response to increasing calls for continuous moderation of the published research literature, a consistent questioning of the functionality of the traditional peer review model (some examples in this post), and an increasing recognition that scientific discourse does not stop at the ‘event’ point of publication for any research article.

Peer Review at ScienceOpen – A brief summary

Open participation

At ScienceOpen, we invite the whole scientific community to contribute to the review process, should they wish to. The only requirement is that reviewers are registered with ORCID and have at least five publications assigned to their ORCID account (Scientific Members and Experts). If you do not satisfy these requirements and wish to perform a peer review at ScienceOpen, please contact us and we will make an exception for you.

Users with at least one publication assigned to their ORCID account are able to comment on a paper (Members). Please refer to our User categories for further details.
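The tiers above can be summarised as a simple rule. Purely as an illustration, here is a minimal sketch in Python – the function name and category labels are our own shorthand for this post, not an actual ScienceOpen API:

```python
# Hypothetical sketch of the user categories described above.
# Thresholds: 5+ publications on ORCID = may review; 1+ = may comment.

def user_category(has_orcid: bool, publication_count: int) -> str:
    """Map a user's ORCID publication record to an illustrative role."""
    if not has_orcid:
        return "unregistered"  # must register with ORCID first
    if publication_count >= 5:
        return "expert"        # may write formal peer reviews
    if publication_count >= 1:
        return "member"        # may comment on papers
    return "user"              # may read and invite reviewers

print(user_category(True, 7))  # an expert, eligible to review
print(user_category(True, 2))  # a member, eligible to comment
```

Note that, as described above, editors can still grant reviewer status to researchers below the publication threshold, so the real rule is a default rather than a hard gate.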

We also encourage users to use our ‘Invite to review’ function (see below), which is available on more than 31 million article records. We know that editorial control will always be a critical aspect of any open peer review system, including PPPR, and therefore encourage collection Editors to solicit peer reviews for articles within their collections.

Continue reading “Defining Open Peer Review at ScienceOpen”  


Peer Review Week 2017 is all about transparency

At ScienceOpen, we have been pushing for greater transparency in peer review since our inception. We inject transparency at multiple levels, by identifying referees, publishing reports, providing formal recognition for contributions, and encouraging open interaction on our platform (more details here).

This is why we’re extremely stoked that this year’s Peer Review Week is all about transparency!

The idea for the first Peer Review Week, held in 2015, grew out of informal conversations between ORCID, ScienceOpen, PRE (Peer Review Evaluation), Sense About Science, and Wiley, the organizations that planned and launched the initiative. Last year, ScienceOpen hosted a webinar with Nature, PaperHive, and Publons on the theme of recognising review.

In 2017, we are helping to organise a session at the Peer Review Congress to help showcase what peer review should look like when it works. We look forward to working with the other partner organisations and the global scholarly community in helping to make peer review a fairer, more transparent process.

Continue reading “Peer Review Week 2017 is all about transparency”  


A new gold standard of peer review is needed

How can something exclusive, secretive, and irreproducible be considered to be objective? How can something exclusive, secretive, and irreproducible be considered as a ‘gold standard’ of any sort?

Traditional, closed peer review has these traits, yet for some reason it is held in esteem as the most rigorous and objective standard of research and knowledge generation that we have. That peer review fails its own test of integrity and validation is one of the greatest ironies of the academic world.

What we need is a new standard of peer review suitable for a Web-based world of scholarly communication – one that accommodates the increasingly rapid communication of research and new sources of information, and that brings peer review out of the dark (literally) ages into an era of fast, open, digital knowledge dissemination.

What should a standard for peer review look like in 2017?

The big test for peer review, and any future version of it, is how does the scientific community apply its stamp of approval?

Continue reading “A new gold standard of peer review is needed”  


What are the barriers to post-publication peer review?

At ScienceOpen, we have over 28 million article records all available for public, post-publication peer review (PPPR), 3 million of which are full-text Open Access. This functionality is a response to increasing calls for continuous moderation of the published research literature, a consistent questioning of the functionality of the traditional peer review model (some examples in this post), and an increasing recognition that scientific discourse does not stop at the point of publication for any research article.

Post-publication peer review at ScienceOpen in action!

In spite of this increasing demand, the uptake of PPPR across different platforms seems to be relatively low overall. So what are some of the main reasons why researchers might feel less motivated to do PPPR, and is there anything we can do to increase its usage and adoption as part of a more open research culture?

What even is ‘post-publication’ peer review?

There is a general mentality among researchers that once research has been published, it has already ‘passed’ peer review, so why should it need to be peer reviewed again?

Continue reading “What are the barriers to post-publication peer review?”  


A post-publication peer review success story


In 2016, Dr. Joel Pitt and Prof. Helene Hill published an important paper in ScienceOpen Research. In their paper, they propose new statistical methods to detect fraudulent scientific data, and demonstrate the use of their method on a single case of suspected fraud. Crucially, in their excellent effort to combat fraud, Pitt and Hill make the raw data on which they tested their method publicly available on the Open Science Framework (OSF). Considering that a single case of scientific fraud can cost institutions and private citizens a huge amount of money, their result is provocative, and it emphasizes how important it is to make the raw data of research papers publicly available.

Since its publication on ScienceOpen, the Pitt and Hill (2016) article has been read and downloaded almost 100 times a day. More importantly, it now has 7 independent post-publication peer reviews and 5 comments. Although this is a single paper in ScienceOpen’s vast index of 28 million research articles (all open to post-publication peer review!), the story of how this article attracted so much attention is worth re-telling.

Enhanced article-level statistics and context – just one of 28 million on our platform!

Peer review history

The manuscript was submitted and published in January 2016, and the final typeset version of the article was available for download on March 1st. Shortly after this in May 2016, PhD student Chris Hartgerink publicly peer reviewed the article, summarising it as “Interesting research, but in need of substantial rewriting.”

It was after this that the article came to the attention of Prof. Philip B. Stark, an Associate Dean at the University of California, Berkeley, and author of the most highly read article on our platform with over 39,000 views to date!

Prof. Stark runs a course on the theory and application of statistical models. In his course, groups of students replicate and critique the statistical analyses of published research articles using the articles’ publicly available raw data. Obviously, for this course to work, Prof. Stark needs rigorous research articles and the raw data used in them. In this sense, Pitt and Hill’s article on ScienceOpen was an ideal candidate.

The groups of students started their critical replication of the Pitt and Hill article in the Fall semester of 2016 and finished right before the new year. By actively engaging with research in this way, students gain the confidence and expertise to critically analyse published work.

The post-publication peer review function on ScienceOpen is usually open only to researchers with more than 5 published articles, which would normally have barred Stark’s groups from publishing their critical replications. However, upon hearing about his amazing initiative, ScienceOpen opened its review function to each of Prof. Stark’s vetted early-career researchers. And importantly, since each peer review on ScienceOpen is assigned a Crossref DOI along with a CC BY license, by posting their reviews, each member of the group has officially shared their very own scientific publication.

This also means that each peer review can be easily imported into any user’s ORCID, Publons, and even ImpactStory profiles – the choice is yours!

Public, post-publication peer review works

All of the complete peer reviews from the groups of students can be found below. They all come with highly detailed statistical analyses of the research, and are thorough, constructive, and critical, as we expect an open peer review process to be.

Furthermore, unlike almost every other post-publication peer review function out there, peer reviews on ScienceOpen can incorporate graphics and plots. This awesome feature was added specifically for Prof. Stark’s course, but it is now available for any peer review on ScienceOpen.

Continue reading “A post-publication peer review success story”  


The winner of our Peer Review Week 2016 competition

For Peer Review Week 2016, we set a simple competition for you all: publicly peer review one of the 25 million research articles on our platform. This fitted perfectly with this year’s theme of ‘Recognising Review’, as every single peer review conducted with us is published openly and made creditable through the application of a CC BY license, which enables unrestricted sharing and re-use of the reviews provided that attribution is given.

We’re happy to announce that Lauren Collister was the winner this year, and a t-shirt is on its way!


Lauren performed a civil, constructive, and detailed peer review of a paper entitled Crowdsourcing Language Change with Smartphone Applications. This article is also part of our Language Change collection, created by George Walkden.


We now have 118 open post-publication peer reviews on our platform. Each one is citable with a Crossref DOI and can be interlinked with Publons, ORCID, and ImpactStory, helping to build your profile as a researcher. This is a practical example that this form of peer review works!


Recognition for Review is focus for Peer Review Week 2016

To honor and celebrate peer review, a group of organizations is working collaboratively to plan a week of activities and events. The group is delighted to announce that the second annual Peer Review Week will run from September 19–25, 2016.


This year’s theme is Recognition for Review, exploring all aspects of how those participating in review activity – in publishing, grant review, conference submissions, promotion and tenure, and more – should be recognized for their contribution.

Continue reading “Recognition for Review is focus for Peer Review Week 2016”  
