In the last few months at ScienceOpen, we have rolled out an incredible number of new features for our users. Now, we feel it is time to take stock, and reflect on how you are all using them to help enhance your research. We want to recognise some of the valuable work from the global research community in helping to make science more open!
There are now 177 excellent research collections published on ScienceOpen, each with our pretty slick new collection statistics. With this in mind, we want to highlight just a few of the latest collections that have really caught our eye. Here, the collection editors have each done exceptional work in curating and promoting research to create a valuable resource for their communities.
Last week, we unveiled MyScienceOpen, our new professional networking platform for researchers. MyScienceOpen comes with a whole suite of new features that combine the functionality of a range of existing social platforms and bring them directly to you, all in one place. You get access to all of this at the click of a button through our ORCID integration. Free, legal, and easy!
But how can you track this increased impact? Aha, well, we thought of that. Every user at ScienceOpen now has a whole suite of brand new metrics that track how their content is being re-used across ScienceOpen.
Quantifying your context
Everyone can see the tab for your content in context. It shows how many total ‘nodes’, or connections, your work forms within the ScienceOpen platform. We show:
Your total number of publications
Your total number of different journals published in
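The two metrics above can be computed from any simple list of publication records. Here is a minimal sketch in Python; the record fields (`title`, `journal`) are illustrative and not ScienceOpen's actual data model.

```python
def context_metrics(publications):
    """Count total publications and distinct journals from simple records.

    Each record is a dict; records without a journal (e.g. preprints)
    still count as publications but not towards distinct journals.
    """
    journals = {p["journal"] for p in publications if p.get("journal")}
    return {
        "total_publications": len(publications),
        "distinct_journals": len(journals),
    }

records = [
    {"title": "Paper A", "journal": "Journal of X"},
    {"title": "Paper B", "journal": "Journal of X"},
    {"title": "Paper C", "journal": "Journal of Y"},
]
print(context_metrics(records))  # {'total_publications': 3, 'distinct_journals': 2}
```

Counting *distinct* journals (via a set) rather than raw journal entries is what makes the second metric meaningful as a measure of breadth.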
Last week, we were pleased to announce the launch of our new professional networking platform, MyScienceOpen.
Fully integrated into our article archive of 32 million article records, and combined with our extensive researcher toolkit, MyScienceOpen is the only research networking platform you will ever need!
Now, this is not just another researcher profile. This is the researcher profile. Why have one profile for your research usage, another for your article records, another for your science communication activities, another to record your peer reviews, another for searching for research, and another for tracking citations and Altmetrics? It’s exhausting!
Making an impact in a research discovery ecosystem
We designed these new features for you to make an increased impact, and keep track as your research progresses. All of this is provided to you within the context of a discovery environment of more than 31 million article records. It just makes sense to have these profile and article enhancement features integrated into an ecosystem where people are actually discovering and re-using research. And for free, of course.
‘Open research’ isn’t just about sharing resources like data, code, and papers, although this is a big part of it. One big, and often under-appreciated aspect of it is about making research accessible, inclusive, and participatory. A major principle driving this is leveraging transparency to bring processes and factors that are currently hidden into public view.
One area of research and scholarly communication where this debate is still very much ongoing is peer review – our system of validation and gatekeeping for the vast archives of public knowledge.
OpenAIRE have released an important new survey and analysis on attitudes and experiences towards ‘Open Peer Review’ (OPR), based on more than 3000 respondents (full data available here to play with). This is important, as OPR is all about the principles above – making the process transparent, collaborative, inclusive, and in the end, better!
Below, we discuss some of the major findings of the survey, and how we at ScienceOpen fit into the bigger picture of Open Peer Review.
The future is Open
The main result of the survey is that the majority (60.3%) of respondents are in favour of OPR becoming mainstream scholarly practice, particularly regarding open interaction, open reports, and final-version commenting. Part of this is due to the relatively low satisfaction scores reported: just 56.4% of respondents were satisfied with traditional closed peer review, and 20.6% were dissatisfied – a much narrower gap than in previous reports. More than three quarters of respondents had previously engaged with OPR as an author, reviewer, or editor. This suggests that OPR, in one form or another, is probably already more common practice than we might think.
Interestingly, this development is similar to what we saw with other aspects of ‘open science’ such as open access and open data – there is debate, experimentation, variable implementation, and finally they start to become accepted as the norm as policies, practices, and cultures adapt. The survey also showed that 88.2% of respondents were in favour of Open Access to publications, a much higher value than several years ago. It also found that support for OPR is correlated with support for Open Data and Open Access, which is perhaps not surprising, although conversations regarding OPR are still in their relative infancy.
This suggests that as debates around OPR mature, we are likely to see an increase in its uptake and support, as with other areas of ‘Open’. Indeed, the survey also found a generational difference in support for OPR, with younger researchers favouring it more than more-established researchers do. As it is these younger generations who will inherit and govern the system in the future, that system is more likely to have the characteristics they favour.
Recently, our colleagues at OpenAIRE have published a systematic review of ‘Open Peer Review’ (OPR). As part of this, they defined seven consistent traits of OPR, which we thought sounded like a remarkably good opportunity to help clarify how peer review works at ScienceOpen.
At ScienceOpen, we have over 31 million article records all available for public, post-publication peer review (PPPR), more than 3 million of which are full-text Open Access. This functionality is a response to increasing calls for continuous moderation of the published research literature, a consistent questioning of the functionality of the traditional peer review model (some examples in this post), and an increasing recognition that scientific discourse does not stop at the ‘event’ point of publication for any research article.
At ScienceOpen, we invite the whole scientific community to contribute to the review process, should they wish to. The only requirement is that reviewers must be registered with ORCID and have at least five publications assigned to their ORCID account (Scientific Members and Experts). If you do not satisfy these requirements and wish to perform a peer review at ScienceOpen, please contact us and we will make an exception for you.
Users with at least one publication assigned to their ORCID account are able to comment on a paper (Members). Please refer to our User categories for further details.
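The eligibility rules above boil down to a simple threshold check on the number of publications attached to an ORCID account. A minimal sketch, using the category names from this post (the function name and return strings are illustrative, not an actual ScienceOpen API):

```python
def review_privileges(publication_count):
    """Map an ORCID publication count to the user categories described above.

    >=5 publications: may write peer reviews (Scientific Members and Experts)
    >=1 publication:  may comment on papers (Members)
    otherwise:        contact ScienceOpen to request an exception
    """
    if publication_count >= 5:
        return "Scientific Member: may write peer reviews and comments"
    if publication_count >= 1:
        return "Member: may comment on papers"
    return "Registered user: contact ScienceOpen to request an exception"

print(review_privileges(7))  # Scientific Member: may write peer reviews and comments
print(review_privileges(2))  # Member: may comment on papers
```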
We recognise that sometimes it’s not clear exactly what you’re supposed to do when joining a new research platform. What are the important features, what’s everybody else doing, how do I make my profile as strong as possible? Well, hopefully this will make it easier for you. If you’re still wondering ‘What’s that ScienceOpen thing all about?’, hopefully this will add a bit of clarity too!
Here are the main things you need to know about ScienceOpen:
Get an ORCID account
More than 3 million researchers already have an ORCID account, which acts as both a unique identifier and an integrated profile for them. Registration takes 30 seconds, and ORCID is now a core part of scholarly infrastructure, with many journals requiring an ORCID profile prior to article submission. Make sure it’s well-populated with all of your published papers (drawn automatically from Web of Science, Scopus, or CrossRef). Easy!
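For the curious: the last character of a 16-digit ORCID iD is a check digit computed with the ISO 7064 MOD 11-2 algorithm (as documented by ORCID), so a well-formed iD can be sanity-checked locally before hitting any API. A minimal sketch, using ORCID's own example iD:

```python
def orcid_checksum_ok(orcid_id):
    """Validate the ISO 7064 MOD 11-2 check digit of an ORCID iD.

    The check character is the 16th digit; 'X' represents the value 10.
    """
    digits = orcid_id.replace("-", "")
    if len(digits) != 16 or not digits[:-1].isdigit():
        return False
    total = 0
    for ch in digits[:-1]:
        total = (total + int(ch)) * 2
    expected = (12 - total % 11) % 11
    check = "X" if expected == 10 else str(expected)
    return digits[-1] == check

print(orcid_checksum_ok("0000-0002-1825-0097"))  # True
```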
At ScienceOpen, we have been pushing for greater transparency in peer review since our inception. We inject transparency at multiple levels, by identifying referees, publishing reports, providing formal recognition for contributions, and encouraging open interaction on our platform (more details here).
This is why we’re extremely stoked that the theme for this year’s Peer Review Week is transparency!
In 2017, we are helping to organise a session at the Peer Review Congress to help showcase what peer review should look like when it works. We look forward to working with the other partner organisations and the global scholarly community in helping to make peer review a fairer, more transparent process.
How can something exclusive, secretive, and irreproducible be considered to be objective? How can something exclusive, secretive, and irreproducible be considered as a ‘gold standard’ of any sort?
Traditional, closed peer review has all of these traits, and yet for some reason it is held in esteem as the most rigorous and objective standard of research and knowledge generation that we have. That peer review fails its own test of integrity and validation is one of the greatest ironies of the academic world.
What we need is a new standard of peer review that is suitable for a Web-based world of scholarly communication. This will help accommodate the increasingly rapid communication of research and new sources of information, and bring peer review out of the dark ages (literally) and into an age that makes sense in a world of fast, open, digital knowledge dissemination.
What should a standard for peer review look like in 2017?
The big test for peer review, and any future version of it, is how does the scientific community apply its stamp of approval?
At ScienceOpen, we have over 28 million article records all available for public, post-publication peer review (PPPR), 3 million of which are full-text Open Access. This functionality is a response to increasing calls for continuous moderation of the published research literature, a consistent questioning of the functionality of the traditional peer review model (some examples in this post), and an increasing recognition that scientific discourse does not stop at the point of publication for any research article.
In spite of this increasing demand, the uptake of PPPR across different platforms seems to be relatively low overall. So what are some of the main reasons why researchers might feel less motivated to do PPPR, and is there anything we can do to increase its usage and adoption as part of a more open research culture?
What even is ‘post-publication’ peer review?
There is a general mentality among researchers that once research has been published, it has already ‘passed’ peer review, so why should it need to be peer reviewed again?