To increase the scientific rigor, transparency, and reproducibility of published research outputs, a collaboration with SciCrunch will provide SciScore reports for all manuscript submissions.
The interactive portal Drug Repurposing Central will become a hub for high-quality open access drug repurposing research publications.
Robust data is at the heart of every research article. Increasingly, researchers are making great efforts to make their raw data and software available to other researchers as part of a move to more open and reproducible science. They are carefully managing data generation with new tools and storing digital research data in open data repositories or special subject repositories. But the heterogeneous and sometimes sensitive nature of data raises numerous hurdles. ScienceOpen has taken up the challenge by adding Data Availability Statements to all of its preprint and poster publications and as an additional metadata field for all articles on the site. If your data is open, share it with the world!
“Without data you’re just another person with an opinion.” (W. Edwards Deming)
This Spring, we are organising a little competition for all you researchers! Review an article on ScienceOpen before the end of April, and we will enter you into a prize drawing for an Amazon Kindle Fire tablet.
- Open Peer Review on ScienceOpen
ScienceOpen currently indexes more than 40 million articles, including 3.7 million open access articles and more than 1.4 million preprints. All of these articles are open on ScienceOpen to a fully transparent review process: open identities, open reports, and open interaction on the platform (see our previous blog post here).
At ScienceOpen, we believe that “Open Science” is not just about sharing research data. For us, “Open Science” aims to make research and its underlying data accessible in order to inform research communities and allow them to take part in discussions in their field, increasing overall participation and the inclusion of different perspectives.
Open peer review is also crucial in the current context of rapidly developing open science and digital scholarly communication. If open access to scientific content is a first victory for the advancement of research and innovation, open peer review must still become an established part of this practice to realise its full credibility and benefit.
- What does reviewing on ScienceOpen bring concretely to reviewers?
→ Reviews are published under a Creative Commons Attribution License (CC BY 4.0) and receive a Digital Object Identifier (DOI) from Crossref. This makes them fully equivalent to any other open access publication, and they can be cited or integrated into platforms such as Publons, Impactstory, or ORCID.
→ As open access publications indexed on ScienceOpen, reviews are public and can be found easily on the platform using the filter “Content type”: “Review”. For a more precise search, this filter can be combined with, for example, the title of an article.

→ Reviewing articles on ScienceOpen is a great way for reviewers to show their involvement in their research field and their appreciation for researchers who have dedicated their time to providing a research resource to their community.
The only requirements to write a review on ScienceOpen are to be registered with ORCID (this is automatic if you have a ScienceOpen profile) and to have at least five publications assigned to your ORCID account (which grants you ScienceOpen Expert status). If you do not meet these requirements but would still like to review a paper, contact us.
To enter the drawing, all you need to do is:
→ Log in to ScienceOpen
→ Explore our Content, our Collections
→ Choose any article in your field and click “Review article”.
You can also “Invite someone to review”. This video will help you get started.

We look forward to your reviews & will announce the winner on April 30th, 2018!
Good luck!
At ScienceOpen, we have been pushing for greater transparency in peer review since our inception. We inject transparency at multiple levels, by identifying referees, publishing reports, providing formal recognition for contributions, and encouraging open interaction on our platform (more details here).
This is why we’re extremely stoked that this year’s Peer Review Week is all about transparency!
The idea for the first Peer Review Week, held in 2015, grew out of informal conversations between ORCID, ScienceOpen, PRE (Peer Review Evaluation), Sense About Science, and Wiley, the organizations that planned and launched the initiative. Last year, ScienceOpen hosted a webinar with Nature, PaperHive, and Publons on the theme of recognising review.

In 2017, we are helping to organise a session at the Peer Review Congress to showcase what peer review should look like when it works. We look forward to working with the other partner organisations and the global scholarly community to make peer review a fairer, more transparent process.
Open peer review has many different aspects, and is not simply about removing anonymity from the process. Open peer review forms part of the ongoing evolution of an open research system, and the transformation of peer review into a more constructive and collaborative process. The ultimate goal of traditional peer review remains the same – to make sure that the work of authors gets published to an acceptable standard of scientific rigour.
There are different levels of bi-directional anonymity in the peer review process: the referees may know who the authors are but not vice versa (single-blind review), or both parties may remain anonymous to each other (double-blind review). Open peer review is a relatively new phenomenon (initiated in 1999 by the BMJ), one aspect of which is that the authors’ and referees’ names are disclosed to each other. Open peer review is founded on transparency, which helps avoid the competition and conflicts of interest that can arise because those performing peer review are often the authors’ closest competitors, as they tend to be the most competent to assess the research.