Author: Jon Tennant

In:  Peer Review  

Pre- or post-publication peer review

In traditional models, peer review takes place before publication, is carried out by selected referees, and is mediated by an Editor or Editorial Board. This model has been adopted by the vast majority of journals and acts as the filter that decides what is considered worthy of publication. Under this pre-publication model, most reviews are discarded as soon as the research articles are published, and all of the insight, context, and evaluation they contain is lost from the scientific record.

Several publishers and journals are now exploring more adventurous models of peer review that take place after publication. The principle here is that all research deserves the opportunity to be published, and that filtering through peer review happens after the research has actually been communicated. Numerous venues now provide built-in systems for post-publication peer review, including ScienceOpen, RIO, The Winnower, and F1000 Research. Beyond those adopted by journals, annotation and commenting services such as hypothes.is and PubPeer are independent of any specific journal or publisher and operate across platforms.

A potential future system of peer review (source)

Continue reading “Pre- or post-publication peer review”  

In:  Peer Review  

Should peer review reports be published

One main aspect of open peer review is that referee reports are made publicly available after the peer review process. The theory underlying this is that peer review becomes a supportive and collaborative process, viewed more as an ongoing dialogue between groups of scientists to progressively assess the quality of research. Furthermore, it opens the reviews themselves to analysis and inspection, which adds another layer of quality control to the review process.

This co-operative and interactive mode of peer review, in which it is treated as a conversation rather than a selection system, has been shown to be highly beneficial to researchers and authors. A study in 2011 found that implementing an open review system led to greater co-operation between referees and authors, more accurate reviews, and fewer errors throughout the review process. Ultimately, it is this process which decides whether research is suitable or ready for publication. A recent study has even shown that the transparency of the peer review process can be used to predict the quality of published research. As far as we are aware, there are almost no drawbacks, documented or otherwise, to making referee reports openly available. What we gain by publishing reviews is the time, effort, knowledge exchange, and context of an enormous amount of currently secretive and largely wasted dialogue, which could also save around 15 million hours per year of otherwise lost work by researchers.

 

Source

Continue reading “Should peer review reports be published”  

In:  Peer Review  

Peer review: open sesame?

Open peer review has many different aspects, and is not simply about removing anonymity from the process. Open peer review forms part of the ongoing evolution of an open research system, and the transformation of peer review into a more constructive and collaborative process. The ultimate goal of traditional peer review remains the same – to make sure that the work of authors gets published to an acceptable standard of scientific rigour.

Peer review involves different levels of anonymity: the referees may know who the authors are while remaining anonymous themselves (single blind review), or both parties may remain anonymous to each other (double blind review). Open peer review is a relatively new phenomenon (initiated in 1999 by the BMJ), one aspect of which is that the authors' and referees' names are disclosed to each other. Open peer review is founded on transparency, intended to reduce the competition and conflicts of interest that arise because those performing peer review are often the authors' closest competitors, as they tend to be the most competent to assess the research.

Continue reading “Peer review: open sesame?”  

In:  Other  

Research on Zika virus free to publish via ScienceOpen

Two days ago, the World Health Organisation declared that the threat of the Zika virus disease in Latin America and the Caribbean constituted a Public Health Emergency of International Concern.

The decision was based on clusters of microcephaly and Guillain-Barré syndrome, devastating congenital malformations and neurological complications. While a direct causal relationship has yet to be formally established, Zika infection during pregnancy and microcephaly are strongly correlated.

At ScienceOpen, we believe that rapid publication serves the communication of research, and aim to have submitted papers published online within 24-48 hours. For articles relating to the Zika outbreak, we are waiving the usual submission charge, and any published articles will be integrated into our pre-existing research collection on the Zika virus. Articles will receive top priority, and therefore be almost immediately available to the research community, medical professionals, and the wider public. We encourage submission of all articles relating to the virus. Please directly contact Stephanie Dawson for submissions and related enquiries.

Aedes aegypti, one of the culprit mosquitoes. Image: James Gathany, CC BY

There is clearly a need to co-ordinate international efforts, including those of the research community, to investigate and understand the Zika virus better. At ScienceOpen, we want to play our part in facilitating the communication of any such research, and the speedy protection of those at risk. We are happy to join other open access publishers such as F1000 Research and PLOS Current Outbreaks (both of which publish very rapidly), which have similarly declared that all research on the Zika virus can be published with them free of charge.

In:  Peer Review  

Credit given where credit is due

For the majority of scientists, peer review is seen as integral to, and a fundamental part of, their job as a researcher. To be invited to review a research article is perceived as a great honour, a recognition of expertise, and forms part of the duty of a scientist to help progress research. However, the system is in a bit of a fix. With more being published every year and ever-increasing demands on the time and funds of researchers, the ability to competently perform peer review is dwindling, simply because it competes with their other duties. Why, many researchers might ask, should they spend their valuable time reviewing others' work for little to no recognition or reward, as is the case under the traditional model? Indeed, many publishers argue that the greatest value they add is in managing the peer review process, which in many cases is performed on a volunteer basis by academic Editors and referees, and is estimated to cost around $1.9 billion per year to manage. But who actually gets the recognition and credit for all of this work?

Continue reading “Credit given where credit is due”  

In:  Peer Review  

Advances in peer review

It's not hard to see that the practices of and attitudes towards 'open science' are evolving amidst an ongoing examination of what the modern scholarly system should look like. While we might be more familiar with the ongoing debate about how best to implement open access to research articles and to the data behind publications, discussions regarding the structure, management, and process of peer review are perhaps more nuanced, but arguably of equal or greater significance.

Peer review is of enormous importance for managing the content of the published scientific record and the careers of the scientists who produce it. It is perceived as the gold standard of scholarly publishing, and for many it determines whether or not research can be viewed as scientifically valid. Accordingly, peer review is a vital component at the core of research communication, with repercussions for the very structure of academia, which largely operates through a publication-based reward and incentive system.

Continue reading “Advances in peer review”  

In:  Altmetrics  

The relationship between journal rejections and their impact factors

Frontiers recently published a fascinating article about the relationship between impact factors (IF) and rejection rates for a range of journals. It was a neat little study, designed around the perception held by many publishers that, in order to generate high citation counts for their journals, they must be highly selective and publish only the 'highest quality' work.

Apart from the time and money arguably wasted in rejecting perfectly good research, this apparent relationship has important implications for researchers. They will often submit first to higher-impact (and therefore apparently more selective) journals in the hope that this confers some sort of prestige on their work, rather than letting their research speak for itself. Given the relatively high likelihood of rejection, submissions then continue down the 'impact ladder' until a more receptive venue is finally found.

Continue reading “The relationship between journal rejections and their impact factors”  

In:  Peer Review  

The Peer Reviewers Openness Initiative

Openness in scholarly communication takes many forms. One of the most commonly debated in academic spheres is undoubtedly open access – the free, equal, and unrestricted access to research papers. As well as open access, there are also great pushes being made in the realms of open data and open metrics. Together, these all come under an umbrella of ‘open research’.

One important aspect of open research is peer review. At ScienceOpen, we advocate maximum transparency in the peer review process, based on the concept that research should be an open dialogue and not locked away in the dark. We have two main peer review initiatives for our content: peer review by endorsement, and post-publication peer review.

On the complex web that is Open Research. Source.

A new project, the Peer Reviewers Openness Initiative (PROI), was launched recently. Like ScienceOpen, it is grounded in the belief that openness and transparency are core values of science. At its core, the initiative encourages reviewers of research papers to make open practices a pre-condition for a more comprehensive review process. You can read more about the Initiative here, in a paper (open access, obviously) published via the Royal Society.

Central to the PROI are five simple guidelines:

  1. Data should be made publicly available. All data needed for evaluation and reproduction of the published research should be made publicly available, online, hosted by a reliable third party.
  2. Stimuli and materials should be made publicly available. Stimulus materials, experimental instructions and programs, survey questions, and other similar materials should be made publicly available, hosted by a reliable third party.
  3. In case some data or materials are not open, clear reasons (e.g., legal, ethical constraints, or severe impracticality) should be given why. These reasons should be outlined in the manuscript.
  4. Documents containing details for interpreting any files or code, and how to compile and run any software programs, should be made available with the above items. In addition, licensing or other restrictions on their use should be made clear.
  5. The location of all of these files should be advertised in the manuscript, and all files should be hosted by a reliable third party. The choice of online file hosting should be made to maximize the probability that the files will be accessible for many years, and to minimize the probability that they will be lost for trivial reasons (e.g., accidental deletions, moving files).

Stephanie Dawson, CEO of ScienceOpen, and Jon Tennant, Communications Director, have signed the PROI, on behalf of ScienceOpen and independently, respectively, joining more than 200 other researchers to date. Joining only takes a few seconds of your time, and would help to solidify a real commitment to making the peer review process more transparent and to realising the wider goal of an open research environment.

In:  Research  

New ScienceOpen study on the effectiveness of student evaluations of teaching highlights gender bias against female instructors

Student evaluations of teaching form a core part of our education system. However, there is little evidence that they are effective or even work as they are supposed to, despite such rating systems having been used, studied, and debated for almost a century.

A new analysis published in ScienceOpen Research offers evidence against the reliability of student evaluations of teaching, particularly as a measure of teaching effectiveness or as a basis for tenure and promotion decisions. In addition, the new study identified a bias against female instructors.

The new study by Anne Boring, Kellie Ottoboni, and Philip Stark (ScienceOpen Board Member) has already been picked up by several major news outlets including Inside Higher Education and Pacific Standard. This gives it an altmetric score of 54 (at the time of writing), which is the highest for any ScienceOpen Research paper to date!

Continue reading “New ScienceOpen study on the effectiveness of student evaluations of teaching highlights gender bias against female instructors”  

In:  Announcements  

ScienceOpen joins ORCID initiative

ORCID (Open Researcher and Contributor ID) is a community-based effort to provide a registry of unique and persistent researcher identifiers and, through these, to link researchers to their activities and outputs. It is a powerful tool for both researchers and institutions, and can be easily integrated with CrossRef, PubMed Central, Scopus, and other data archives to populate researcher records.

At ScienceOpen, we have always supported the use of ORCID within our services. Membership at ScienceOpen can be updated directly using your ORCID profile, providing seamless integration of the two. To comment on, review, and rate articles, we require an ORCID along with membership at ScienceOpen. If you have more than 5 articles within your ORCID profile, you'll gain Expert member status with us, and free rein of our services!
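
As a rough illustration of the kind of integration described above, the short sketch below queries ORCID's public REST API to count the works listed on a profile, the sort of lookup that could sit behind a membership rule like the five-article threshold mentioned here. It is a minimal sketch only, assuming the public v3.0 endpoint and JSON responses; the `count_orcid_works` helper and the eligibility check are illustrative and not ScienceOpen's actual implementation.

```python
import requests

# ORCID's public, read-only API (assumed v3.0 here).
ORCID_PUBLIC_API = "https://pub.orcid.org/v3.0"


def count_orcid_works(orcid_id: str) -> int:
    """Return the number of distinct works listed on a public ORCID profile."""
    response = requests.get(
        f"{ORCID_PUBLIC_API}/{orcid_id}/works",
        headers={"Accept": "application/json"},
        timeout=10,
    )
    response.raise_for_status()
    # Works are returned in "group"s, so duplicate records of the same output count once.
    return len(response.json().get("group", []))


if __name__ == "__main__":
    # ORCID's documented example identifier, used purely for illustration.
    example_id = "0000-0002-1825-0097"
    works = count_orcid_works(example_id)
    print(f"{example_id} lists {works} works")
    # Hypothetical eligibility rule mirroring the blog's "more than 5 articles" threshold.
    print("Eligible for Expert status (hypothetical check)?", works > 5)
```

In practice, a platform would authenticate users via the ORCID sign-in flow rather than polling the public API by hand, but the same works data underpins both approaches.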

Continue reading “ScienceOpen joins ORCID initiative”