Making an impact in a research discovery ecosystem
We designed these new features to help you increase your impact and keep track of your research as it progresses. All of this is provided within the context of a discovery environment of more than 31 million article records. It just makes sense to integrate these profile and article enhancement features into an ecosystem where people are actually discovering and re-using research. And for free, of course.
‘Open research’ isn’t just about sharing resources like data, code, and papers, although that is a big part of it. Another major, and often under-appreciated, aspect is making research accessible, inclusive, and participatory. A key principle driving this is leveraging transparency to bring processes and factors that are currently hidden into public view.
One area of scholarly communication where this debate is still very much ongoing is peer review – our system of validation and gatekeeping for the vast archives of public knowledge.
OpenAIRE has released an important new survey and analysis of attitudes towards, and experiences of, ‘Open Peer Review’ (OPR), based on more than 3,000 respondents (the full data are available here to play with). This is important, as OPR is all about the principles above – making the process transparent, collaborative, inclusive, and, in the end, better!
Below, we discuss some of the major findings of the survey, and how we at ScienceOpen fit into the bigger picture of Open Peer Review.
The future is Open
The main result of the survey is that the majority (60.3%) of respondents are in favour of OPR becoming a mainstream scholarly practice, particularly regarding open interaction, open reports, and final-version commenting. Part of this is due to the relatively low satisfaction scores reported, with just 56.4% of respondents satisfied with traditional closed peer review and 20.6% dissatisfied – a much narrower gap than in previous reports. More than three quarters of respondents had previously engaged with OPR as an author, reviewer, or editor. This suggests that OPR, in one form or another, is probably already more common practice than we might think.
Interestingly, this development is similar to what we saw with other aspects of ‘open science’ such as open access and open data – there is debate, experimentation, variable implementation, and finally they start to become accepted as the norm as policies, practices, and cultures adapt. The survey also showed that 88.2% of respondents were in favour of Open Access to publications, a much higher value than several years ago. It also found that support for OPR is correlated with support for Open Data and Open Access, which is perhaps not surprising, although conversations regarding OPR are still in their relative infancy.
This suggests that as debates around OPR mature, we are likely to see an increase in its uptake and support, as with other areas of ‘Open’. Indeed, the survey also found a generational difference in support for OPR, with younger researchers favouring it more than established ones. As it is these younger generations who will inherit and govern the system in the future, that system is more likely to take on the characteristics they favour.
Recently, our colleagues at OpenAIRE have published a systematic review of ‘Open Peer Review’ (OPR). As part of this, they defined seven consistent traits of OPR, which we thought sounded like a remarkably good opportunity to help clarify how peer review works at ScienceOpen.
At ScienceOpen, we have over 31 million article records all available for public, post-publication peer review (PPPR), more than 3 million of which are full-text Open Access. This functionality is a response to increasing calls for continuous moderation of the published research literature, a consistent questioning of the functionality of the traditional peer review model (some examples in this post), and an increasing recognition that scientific discourse does not stop at the ‘event’ point of publication for any research article.
At ScienceOpen, we invite the whole scientific community to contribute to the review process, should they wish to. The only requirement for writing a review is that you are registered with ORCID and have at least five publications assigned to your ORCID account (our Scientific Members and Experts). If you do not meet these requirements and wish to peer review on ScienceOpen, please contact us and we will make an exception for you.
Users with at least one publication assigned to their ORCID account are able to comment on a paper (Members). Please refer to our User categories for further details.
At ScienceOpen, we have been pushing for greater transparency in peer review since our inception. We inject transparency at multiple levels, by identifying referees, publishing reports, providing formal recognition for contributions, and encouraging open interaction on our platform (more details here).
This is why we’re extremely stoked that the theme of this year’s Peer Review Week is transparency!
In 2017, we are helping to organise a session at the Peer Review Congress to showcase what peer review looks like when it works. We look forward to working with the other partner organisations and the global scholarly community to make peer review a fairer, more transparent process.
A core concept for our evolving understanding of open research and scholarship is that of equity and fairness within the global research community. At ScienceOpen, this is something we strongly believe in, and work together with a range of publishers and researchers to play our part in making this a reality for research.
As part of our mission, we therefore try to break down barriers in research, and prefer to build bridges rather than walls. Here are just some examples of how we do this, and in doing so contribute to building a platform that acts as a social community space for all researchers.
One of the main features of ScienceOpen is that we are a research aggregator. We don’t select what we index based on discipline, publisher, or geography, as that just creates another silo. Enough of those exist already. What we need, and what we do, is to bring together research articles from across publishers and other platforms and into one space, where it is all treated in exactly the same way.
When articles are displayed in this way, factors such as journal brand and impact factor matter less than the actual content itself. Users can make their own choices about what to read, review, share, and re-use based on their own expertise and evaluation, or on the social context provided by our other users.
Last year, SciELO integrated more than 500,000 Open Access articles with us from across Latin America, for the first time putting all of this research on the same level as research contained within PubMed Central. There is no reason for geographical segregation of research across platforms. We believe that all research deserves to be read and re-used by anyone, irrespective of where that research was conducted and who published it.
Open Access isn’t just about access to knowledge, but also principles of equality, and to achieve that we have to recognize the value of research from around the world.
How can something exclusive, secretive, and irreproducible be considered to be objective? How can something exclusive, secretive, and irreproducible be considered as a ‘gold standard’ of any sort?
Traditional, closed peer review has all of these traits, yet for some reason it is held in esteem as the most rigorous and objective standard of research and knowledge generation that we have. That peer review fails its own test of integrity and validation is one of the great ironies of the academic world.
What we need is a new standard of peer review that is suitable for a Web-based world of scholarly communication. This would help accommodate the increasingly rapid communication of research and new sources of information, and bring peer review out of the (literally) dark ages and into an era that makes sense in a world of fast, open, digital knowledge dissemination.
What should a standard for peer review look like in 2017?
The big test for peer review, and any future version of it, is how the scientific community applies its stamp of approval.
At ScienceOpen, we have over 28 million article records all available for public, post-publication peer review (PPPR), 3 million of which are full-text Open Access. This functionality is a response to increasing calls for continuous moderation of the published research literature, a consistent questioning of the functionality of the traditional peer review model (some examples in this post), and an increasing recognition that scientific discourse does not stop at the point of publication for any research article.
In spite of this increasing demand, the uptake of PPPR across different platforms seems to be relatively low overall. So what are some of the main reasons why researchers might feel less motivated to do PPPR, and is there anything we can do to increase its usage and adoption as part of a more open research culture?
What even is ‘post-publication’ peer review?
There is a general mentality among researchers that once research has been published, it has already ‘passed’ peer review, so why should it need to be peer reviewed again?
The Pitt and Hill (2016) article has been read and downloaded almost 100 times a day since its publication on ScienceOpen. More importantly, it now has 7 independent post-publication peer reviews and 5 comments. Although this is a single paper in ScienceOpen’s vast index of 28 million research articles (all open to post-publication peer review!), the story of how this article got so much attention is worth re-telling.
Prof. Stark runs a course on the theory and application of statistical models. In his course, groups of students replicate and critique the statistical analyses of published research articles using the articles’ publicly available raw data. Obviously, for this course to work, Prof. Stark needs rigorous research articles along with the raw data behind them. In this sense, Pitt and Hill’s article on ScienceOpen was an ideal candidate.
The groups of students started their critical replication of the Pitt and Hill article in the Fall semester of 2016 and finished right before the new year. By actively engaging with published research in this way, students gain the confidence and expertise to analyse it critically.
The Post-Publication Peer Review function on ScienceOpen is usually only open to researchers with more than 5 published articles. This would have normally barred Stark’s groups from publishing their critical replications. However, upon hearing about his amazing initiative, ScienceOpen opened their review function to each of Prof. Stark’s vetted early career researchers. And importantly, since each peer review on ScienceOpen is assigned a CrossRef DOI along with a CC-BY license, after posting their reviews, each member of the group has officially shared their very own scientific publication.
All of the complete peer reviews from the groups of students can be found below. They all come with highly detailed statistical analyses of the research, and are thorough, constructive, and critical, as we expect an open peer review process to be.
Furthermore, unlike almost every other post-publication peer review function out there, peer reviews on ScienceOpen can include integrated graphics and plots. This awesome feature was added specifically for Prof. Stark’s course, but note that it is now available for any peer review on ScienceOpen.
Kick off the new year with the new unified search on ScienceOpen! We have accomplished a lot over the last year and are looking forward to supporting the academic community in 2017.
In 2016 ScienceOpen brought you more context: Now your search comes with a new analytics bar that breaks down your search results by collections, journals, publishers, disciplines, and keywords for quicker filtering. Try a search for the pressing topics of 2016 like Zika or CRISPR and take the new features for a spin.
Researcher output, journal content, reference lists, and citing articles can all be dynamically sorted and explored by Altmetric score, citations, date, or activity. Statistics for journals, publishers, and authors give an overview of the content we are indexing on ScienceOpen. Check out the most relevant journals on ScienceOpen – for example, BMC Infectious Diseases or PLoS Genetics – for a new perspective. Or add your publications to your ORCID and get a dynamic view of your own output.
In 2016 ScienceOpen brought you more open: The ScienceOpen team participated in and helped organize numerous community events promoting Open Science. From Peer Review Week to OpenCon, talks at SSP in Vancouver and SpotOn in London, our team was on the road, debating hot issues in scholarly communication.
In order to bring more visibility to smaller community open access journals – very often with close to non-existent funding and run on a voluntary basis – we launched our platinum indexing competition, geared towards open access journals that charge no APCs to their authors. Four successful rounds in, we have selected 18 journals for indexing and awarded some of them special featured collections on the ScienceOpen platform. This activity was particularly rewarding, as journal editors wrote back expressing their enthusiasm for the ScienceOpen project and enjoying bigger usage numbers for their content.
The ScienceOpen 2.017 version will continue to focus on context, content and open science. We are your starting point for academic discovery and networking. Together let’s explore new ways to support visibility for your publications, promote peer review, improve search and discovery and facilitate collection building. Here is to putting research in context! The year 2016 had some great moments – may 2017 bring many, many more!
For Peer Review Week 2016, we set a simple competition for you all: publicly peer review one of the 25 million research articles on our platform. This fit perfectly with this year’s theme of ‘Recognising Review’, as every single peer review conducted with us is published openly and made creditable through a CC BY license, which enables unrestricted sharing and re-use of the reviews provided that attribution is given.
We’re happy to announce that Lauren Collister was the winner this year – a t-shirt is on its way to her!