One main aspect of open peer review is that referee reports are made publicly available after the peer review process. The theory underlying this is that peer review becomes a supportive and collaborative process, viewed more as an ongoing dialogue between groups of scientists to progressively assess the quality of research. Furthermore, it opens up the reviews themselves to analysis and inspection, which adds an additional layer of quality control to the review process.
This co-operative and interactive mode of peer review, whereby it is treated as a conversation rather than a selection system, has been shown to be highly beneficial to researchers and authors. A study in 2011 found that when an open review system was implemented, it led to increased co-operation between referees and authors, as well as an increase in the accuracy of reviews and an overall decrease in errors throughout the review process. Ultimately, it is this process which decides whether research is suitable or ready for publication. A recent study has even shown that the transparency of the peer review process can be used to predict the quality of published research. As far as we are aware, there are almost no drawbacks, documented or otherwise, to making referee reports openly available. What we gain by publishing reviews is the time, effort, knowledge exchange, and context of an enormous amount of currently secretive and largely wasted dialogue, which could also save around 15 million hours per year of otherwise lost work by researchers.
ORCID (Open Researcher and Contributor ID) is a community-based effort to provide a registry of unique and persistent researcher identifiers and, through this, links to research activities and outputs. It is a powerful tool for both researchers and institutions, and can be easily integrated with CrossRef, PubMed Central, Scopus, and other data archives to populate researcher records.
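Part of what makes an ORCID iD a reliable, persistent identifier is that its final character is a checksum, computed (per ORCID's own documentation) with the ISO 7064 MOD 11-2 algorithm, so systems can catch mistyped iDs before linking records. Here is a minimal sketch in Python of that validation; the function names are our own, and the sample iD is the well-known example from ORCID's documentation:

```python
def orcid_checksum(base_digits: str) -> str:
    """Compute the ISO 7064 MOD 11-2 check character for the
    first 15 digits of an ORCID iD (hyphens removed)."""
    total = 0
    for ch in base_digits:
        total = (total + int(ch)) * 2
    result = (12 - (total % 11)) % 11
    return "X" if result == 10 else str(result)


def is_valid_orcid(orcid: str) -> bool:
    """Check that a hyphenated 16-character ORCID iD has a correct checksum."""
    digits = orcid.replace("-", "")
    if len(digits) != 16 or not digits[:15].isdigit():
        return False
    return orcid_checksum(digits[:15]) == digits[15]


# 0000-0002-1825-0097 is the sample iD used in ORCID's documentation
print(is_valid_orcid("0000-0002-1825-0097"))  # True
print(is_valid_orcid("0000-0002-1825-0098"))  # False: bad check digit
```

A single-character typo changes the check digit, which is exactly why registries and submission systems can validate an iD locally before ever querying ORCID's servers.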
Most of us, whether we are researchers or not, can intuitively grasp what “profile fatigue” is. For those who are thus afflicted, we don’t recommend the pictured Bromo Soda, even though it’s marketed for brain fatigue. This is largely because it contained bromide, which is chronically toxic; medications containing it were removed from the US market starting in 1975 (wow, fairly recent!).
Naturally, in the digital age, it’s important for researchers to have profiles and be associated with their work. Funding, citations, and lots of other good career-advancing benefits flow from this. And it can be beneficial to showcase a broad range of output, so blogs, slide presentations, peer-reviewed publications, conference posters, etc. are all fair game. It’s also best that a researcher’s work belongs uniquely to them, so profile systems need to solve name disambiguation (no small undertaking!).
This is all well and good until you consider the number of profiles a researcher might have created at different sites already. To help us consider this, we put together this list.
ORCID: non-profit, independent, community driven
ResearcherID: publisher-run (Thomson Reuters)
Scopus Author ID: publisher-run (Elsevier)
Academia.edu: researcher network
ResearchGate: researcher network
The list shows that a researcher could have created (or, in the case of Scopus, been assigned) seven or more “profiles”, or more accurately, seven online records of research contributions. That’s on top of those at their research institution and other organizations, and only one iD (helpfully shown in green at the top!) is run by an independent non-profit: ORCID.
Different from a profile, an ORCID iD is a unique, persistent personal identifier that a researcher uses as they publish, submit grants, and upload datasets, connecting them to information on other systems. But not to all other profile systems (sigh). Which leads us, once again, to the concept of “interoperability”, one of the central arguments behind recent community dissatisfaction over the new STM licenses, which we have covered previously.
Put simply, if we all go off and do our own thing with licensing and profiling then we create more confusion and effort for researchers. Best to let organizations like Creative Commons and ORCID take care of making sure that everyone can play nicely in the sandbox (although they do appreciate community advocacy on these issues).
Interoperability is one good reason why ScienceOpen integrated our registration with ORCID and uses their iDs to provide researcher profiles on our site. We don’t do this because we think profiles are kinda neat; they are, but they are also time-consuming and tedious to prepare (especially six times over!).
We did it because we are trying to improve peer review, which we believe should be done after publication by experts with at least five publications on their ORCID iD, and because we believe in minimizing researcher hassle. This is why our registration process is integrated with the creation of an ORCID iD, which could become pivotal for funders in the reasonably near future (so best for researchers to get on board with them now!).
So given that it seems likely that all researchers will need an ORCID iD (and boy, it would be nice if they would get one by registering with us!), it is also important that all the sites listed in the above grid integrate with ORCID, and that hasn’t happened yet (you know who you are!). The others have done a nice job of integrating, by all accounts.
In conclusion, publishers and other service providers need to remember that they serve the scientific community, not the other way around, and this publisher would like to suggest that everyone in the grid please integrate with ORCID pronto!
At ScienceOpen, the research + publishing network, we’re enjoying some of the upsides of being the new kid on the Open Access (OA) block. Innovation and building on the experiments of others is easier when there’s less to lose, but we are also the first to admit that life as a start-up is not for the faint-hearted!
In the years since user-generated comments and reviews were first introduced, those of us who strive to improve research communication have wrestled with questions such as: the potential for career damage; whether content is for peer or public audiences; whether comments should come from experts, everyone, or a mix; and lower-than-anticipated participation.
We want to acknowledge the many organizations who have done a tremendous job at showing different paths forward in this challenging space. Now it’s our turn to try.
Since launch, ScienceOpen has assigned members different user privileges based on their previous publishing history, as verified by their ORCID iD. This seemed like a reasonable way to measure involvement in the field, and it ensures the right level of publishing experience to understand the pitfalls of the process. This neat diagram encapsulates how it works.
Scientific and Expert Members of ScienceOpen can review all the content on the site, which includes 1.3 million+ OA articles and a very small number of our own articles (did we mention we’re new!).
All reviews require a four-point assessment (using five stars) of the level of importance, validity, completeness, and comprehensibility, and there’s space to introduce and summarize the material. Inline annotation captures reviewer feedback during reading. Next up in the site release cycle: mechanisms to make it easy for authors to respond to inline observations.
In a move sure to please busy researchers tired of participating without recognition, each review, including the subsequent dialogue, receives a Digital Object Identifier (DOI) so that others can find and cite the analysis, and the contribution becomes a registered part of the scientific debate.
Welcome to our wonderful world of Reviewing! Please share your feedback here or @Science_Open.