
Tag: peer review

In:  Announcements  

MyScienceOpen: The only networking platform you’ll ever need

Today, we are happy to announce the launch of MyScienceOpen, a professional networking platform designed for the modern research environment.

Since 2013, ScienceOpen has been leading innovation in advanced search and discovery, open peer review, and content management. Now, by leveraging the power of ORCID, we bring you our latest service to help researchers make an impact in the open.

MyScienceOpen is an integrated profile where academics can visualize their research impact through enhanced author-level metrics. Key new features include:

  • Interactive visualizations of an author’s readers, citations, and Altmetric scores for all their publications through time;
  • Addition of non-specialist article summaries, disciplines, keywords, and images to their article records;
  • Export of articles as RIS, BibTeX, and EndNote citation metadata for your reference manager (see the example below);
  • Enhanced statistics for collection Editors to track usage.
New features to enhance your article visibility!
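
For anyone unfamiliar with these export formats, a RIS record looks roughly like the sketch below. Every value shown is a placeholder rather than a real ScienceOpen record; it simply illustrates the kind of citation metadata a reference manager would import.

    TY  - JOUR
    AU  - Example, Author A.
    AU  - Example, Author B.
    TI  - An example article title
    T2  - An Example Journal
    PY  - 2017
    DO  - 10.1234/example.doi
    UR  - https://www.scienceopen.com/
    ER  -

BibTeX and EndNote exports carry the same bibliographic fields in their own respective syntaxes.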

Making an impact in a research discovery ecosystem

We designed these new features to help you increase your impact and keep track of your research as it progresses. All of this is provided within the context of a discovery environment of more than 31 million article records. It just makes sense to have these profile and article enhancement features integrated into an ecosystem where people are actually discovering and re-using research. And for free, of course.

Continue reading “MyScienceOpen: The only networking platform you’ll ever need”  

In:  Peer Review  

Increasing academic support for Open Peer Review

‘Open research’ isn’t just about sharing resources like data, code, and papers, although this is a big part of it. Another major, and often under-appreciated, aspect is making research accessible, inclusive, and participatory. A key principle driving this is leveraging transparency to bring processes and factors that are currently hidden into public view.

One area of research and scholarly communication where this debate is still very much ongoing is peer review – our system of validation and gatekeeping for the vast archives of public knowledge.

OpenAIRE have released an important new survey and analysis of attitudes towards and experiences of ‘Open Peer Review’ (OPR), based on more than 3,000 respondents (full data available here to play with). This is important, as OPR is all about the principles above – making the process transparent, collaborative, inclusive, and, in the end, better!

Below, we discuss some of the major findings of the survey, and how we at ScienceOpen fit into the bigger picture of Open Peer Review.

The future is Open

The main result of the survey is that the majority (60.3%) of respondents are in favour of OPR becoming mainstream scholarly practice, particularly regarding open interaction, open reports, and final-version commenting. Part of this is due to relatively low satisfaction with the status quo: just 56.4% of respondents were satisfied with traditional closed peer review, and 20.6% were dissatisfied – a much smaller gap than reported in previous surveys. More than three quarters of respondents had previously engaged with OPR as an author, reviewer, or editor. This suggests that OPR, in one form or another, is probably already more common practice than we might think.

Interestingly, this development mirrors what we saw with other aspects of ‘open science’, such as open access and open data: there is debate, experimentation, and variable implementation, and finally they start to become accepted as the norm as policies, practices, and cultures adapt. The survey also showed that 88.2% of respondents were in favour of Open Access to publications, a much higher figure than several years ago. It also found that support for OPR is correlated with support for Open Data and Open Access, which is perhaps not surprising, although conversations around OPR are still in their relative infancy.

This suggests that as debates around OPR mature, we are likely to see an increase in its uptake and support, as with other areas of ‘Open’. Indeed, the survey also found a generational difference in support for OPR, with younger researchers favouring it more than their more established colleagues. As these younger generations will inherit and govern the system in the future, it is likely to take on the characteristics that they favour.

Continue reading “Increasing academic support for Open Peer Review”  

In:  Peer Review  

Defining Open Peer Review at ScienceOpen

Recently, our colleagues at OpenAIRE published a systematic review of ‘Open Peer Review’ (OPR). As part of this, they defined seven consistent traits of OPR, which struck us as a remarkably good opportunity to clarify how peer review works at ScienceOpen.

At ScienceOpen, we have over 31 million article records all available for public, post-publication peer review (PPPR), more than 3 million of which are full-text Open Access. This functionality is a response to increasing calls for continuous moderation of the published research literature, a consistent questioning of the functionality of the traditional peer review model (some examples in this post), and an increasing recognition that scientific discourse does not stop at the ‘event’ point of publication for any research article.

Peer Review at ScienceOpen – A brief summary

Open participation

At ScienceOpen, we invite the whole scientific community to contribute to the review process, should they wish to. The only requirement is that reviewers are registered with ORCID and have at least five publications assigned to their ORCID account (Scientific Members and Experts). If you do not satisfy these requirements and wish to perform a peer review at ScienceOpen, please contact us and we will make an exception for you.

Users with at least one publication assigned to their ORCID account are able to comment on a paper (Members). Please refer to our User categories for further details.
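
To make those thresholds concrete, here is a minimal sketch of the rules in Python. It is purely illustrative – the function name and structure are ours, not part of any ScienceOpen code or API – but it captures the ORCID-based user categories described above.

    # Illustrative sketch only -- not ScienceOpen's actual implementation.
    # It encodes the user-category thresholds described above:
    #   registered with ORCID + at least 5 publications -> may write peer reviews
    #   registered with ORCID + at least 1 publication  -> may comment

    def allowed_interactions(has_orcid, publication_count):
        """Return the review-related actions such a user would be permitted."""
        actions = set()
        if not has_orcid:
            return actions             # ORCID registration is the baseline requirement
        if publication_count >= 1:
            actions.add("comment")     # 'Member' level
        if publication_count >= 5:
            actions.add("review")      # 'Scientific Member' / 'Expert' level
        return actions

    print(allowed_interactions(True, 2))   # -> {'comment'}
    print(allowed_interactions(True, 7))   # -> comment and review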

We also encourage users to use our ‘Invite to review’ function (see below), which is available on more than 31 million article records. We know that editorial control will always be a critical aspect of any open peer review system, including PPPR, and therefore encourage collection Editors to solicit peer reviews for articles within their collections.

Continue reading “Defining Open Peer Review at ScienceOpen”  

In:  Peer Review  

Peer Review Week 2017 is all about transparency

At ScienceOpen, we have been pushing for greater transparency in peer review since our inception. We inject transparency at multiple levels, by identifying referees, publishing reports, providing formal recognition for contributions, and encouraging open interaction on our platform (more details here).

This is why we’re extremely stoked that this year’s Peer Review Week is all about transparency!

The idea for the first Peer Review Week grew out of informal conversations between ORCID, ScienceOpen, PRE (Peer Review Evaluation), Sense About Science, and Wiley, the organizations that planned and launched the initiative in 2015. Last year, ScienceOpen hosted a webinar with Nature, PaperHive, and Publons on the theme of recognising review.

In 2017, we are helping to organise a session at the Peer Review Congress to help showcase what peer review should look like when it works. We look forward to working with the other partner organisations and the global scholarly community in helping to make peer review a fairer, more transparent process.

Continue reading “Peer Review Week 2017 is all about transparency”  

In:  About SO  

ScienceOpen is a resource for the community

A core concept in our evolving understanding of open research and scholarship is equity and fairness within the global research community. At ScienceOpen, this is something we strongly believe in, and we work together with a range of publishers and researchers to play our part in making it a reality.

As part of our mission, we therefore try to break down barriers in research, and prefer to build bridges over walls. Here are just some examples of how we do this, and in doing so contribute to building a platform that acts as a social community space for all researchers.

We harvest content from across platforms like PubMed Central, arXiv, and SciELO, and bring it all together in one place

One of the main features of ScienceOpen is that we are a research aggregator. We don’t select what we index based on discipline, publisher, or geography, as that just creates another silo. Enough of those exist already. What we need, and what we do, is to bring research articles from across publishers and other platforms together into one space, where they are all treated in exactly the same way.

When articles are displayed in this way, factors such as journal brands and impact factors matter less than the actual content itself. Users can make their own choices about what to read, review, share, and re-use based on their own expertise and evaluation, or on the social context provided by our other users.

We also don’t just focus on the hard sciences or on the humanities and social sciences. Too often, the main fields of research are segregated from each other rather than working together in interdisciplinary harmony. This is why we integrate research from across fields and at different levels, such as with the fantastic Open Library of Humanities, and, more recently, a whole range of new content from Materials Science, Biomedical Science, Entomology, Archaeology, Medical and Health Research, and er, dinosaurs.

Last year, SciELO integrated more than 500,000 Open Access articles with us from across Latin America, for the first time putting all of this research on the same level as content from PubMed Central. There is no reason why research should be geographically segregated across platforms. We believe that all research deserves to be read and re-used by anyone, irrespective of where it was conducted and who published it.

Open Access isn’t just about access to knowledge, but also about principles of equality, and to achieve that we have to recognize the value of research from around the world.

Continue reading “ScienceOpen is a resource for the community”  

In:  Peer Review  

A new gold standard of peer review is needed

How can something exclusive, secretive, and irreproducible be considered to be objective? How can something exclusive, secretive, and irreproducible be considered as a ‘gold standard’ of any sort?

Traditional, closed peer review has all of these traits, yet for some reason it is held in esteem as the most rigorous and objective standard of research and knowledge generation that we have. That peer review fails its own test of integrity and validation is one of the greatest ironies of the academic world.

What we need is a new standard of peer review that is suitable for a Web-based world of scholarly communication. It should accommodate the increasingly rapid communication of research and new sources of information, and bring peer review out of the dark (literally) ages and into a world of fast, open, digital knowledge dissemination.

What should a standard for peer review look like in 2017?

The big test for peer review, and any future version of it, is how the scientific community applies its stamp of approval.

Continue reading “A new gold standard of peer review is needed”  

In:  Peer Review  

What are the barriers to post-publication peer review?

At ScienceOpen, we have over 28 million article records all available for public, post-publication peer review (PPPR), 3 million of which are full-text Open Access. This functionality is a response to increasing calls for continuous moderation of the published research literature, a consistent questioning of the functionality of the traditional peer review model (some examples in this post), and an increasing recognition that scientific discourse does not stop at the point of publication for any research article.

Post-publication peer review at ScienceOpen in action!

In spite of this increasing demand, the uptake of PPPR across different platforms seems to be relatively low overall. So what are some of the main reasons why researchers might feel less motivated to do PPPR, and is there anything we can do to increase its usage and adoption as part of a more open research culture?

What even is ‘post-publication’ peer review?

There is a general mentality among researchers that once research has been published, it has already ‘passed’ peer review, so why should it need to be peer reviewed again?

Continue reading “What are the barriers to post-publication peer review?”  

In:  Peer Review  

A post-publication peer review success story

 

In 2016, Dr. Joel Pitt and Prof. Helene Hill published an important paper in ScienceOpen Research. In their paper, they propose new statistical methods to detect fraudulent scientific data, and they demonstrate the use of their method on a single case of suspected fraud. Crucially, in their excellent effort to combat fraud, Pitt and Hill make the raw data on which they tested their method publicly available on the Open Science Framework (OSF). Considering that a single case of scientific fraud can cost institutions and private citizens a huge amount of money, their result is provocative, and it emphasizes how important it is to make the raw data of research papers publicly available.

The Pitt and Hill (2016) article has been read and downloaded almost 100 times a day since its publication on ScienceOpen. More importantly, it now has 7 independent post-publication peer reviews and 5 comments. Although this is a single paper in ScienceOpen’s vast index of 28 million research articles (all open to post-publication peer review!), the story of how this article got so much attention is worth re-telling.

Enhanced article-level statistics and context – just one of 28 million on our platform!

Peer review history

The manuscript was submitted and published in January 2016, and the final typeset version of the article was available for download on March 1st. Shortly after this in May 2016, PhD student Chris Hartgerink publicly peer reviewed the article, summarising it as “Interesting research, but in need of substantial rewriting.”

It was after this that the article came to the attention of Prof. Philip B. Stark, an Associate Dean at the University of California, Berkeley, and author of the most highly read article on our platform with over 39,000 views to date!

Prof. Stark runs a course on the theory and application of statistical models. In his course, groups of students replicate and critique the statistical analyses of published research articles using the articles’ publicly available raw data. Obviously, for this course to work, Prof. Stark needs rigorous research articles along with the raw data used in them. In this sense, Pitt and Hill’s article on ScienceOpen was an ideal candidate.

The groups of students started their critical replication of the Hill and Pitt article in the Fall semester of 2016 and finished right before the new year. By actively engaging with published research in this way, students gain the confidence and expertise to analyse it critically.

The post-publication peer review function on ScienceOpen is usually only open to researchers with more than 5 published articles, which would normally have barred Stark’s groups from publishing their critical replications. However, upon hearing about this amazing initiative, we opened our review function to each of Prof. Stark’s vetted early career researchers. And importantly, since each peer review on ScienceOpen is assigned a CrossRef DOI along with a CC BY license, after posting their reviews, each member of the group has officially shared their very own scientific publication.

This also means that each peer review can be easily imported into any user’s ORCID, Publons, and even ImpactStory profiles – the choice is yours!

Public, post-publication peer review works

All of the complete peer reviews from the groups of students can be found below. They all come with highly detailed statistical analyses of the research, and are thorough, constructive, and critical, as we expect an open peer review process to be.

Furthermore, unlike almost every other post-publication peer review function out there, peer reviews on ScienceOpen can include integrated graphics and plots. This awesome feature was added specifically for Prof. Stark’s course, but note that it is now available for any peer review on ScienceOpen.

Continue reading “A post-publication peer review success story”  

In:  Other  

Welcome to ScienceOpen version 2.017

Kick off the new year with the new unified search on ScienceOpen! We have accomplished a lot over the last year and are looking forward to supporting the academic community in 2017.

In 2016 ScienceOpen brought you more context: Now your search comes with a new analytics bar that breaks down your search results by collections, journals, publishers, disciplines, and keywords for quicker filtering. Try a search for the pressing topics of 2016 like Zika or CRISPR and take the new features for a spin.

Researcher output, journal content, reference lists, and citing articles can all be dynamically sorted and explored via Altmetric score, citations, date, or activity. Statistics for journals, publishers, and authors give an overview of the content that we are indexing on ScienceOpen. Check out the most relevant journals on ScienceOpen, for example BMC Infectious Diseases or PLOS Genetics, for a new perspective. Or add your publications to your ORCID record and get a dynamic view of your own output.

Image by Epic Fireworks, Flickr, CC BY

In 2016 ScienceOpen brought you more content: We welcomed publisher customers from across the entire spectrum of disciplines to ScienceOpen and expect many more in the coming year. We added multiple journals from Brill, River Publishers, Open Library of Humanities, and Higher Education Press, and featured collections for PeerJ Computer Science, Cold Spring Harbor Laboratory Press Molecular Case Studies, and the Italian Society for Victimology. We had the pleasure of working with a very diverse group, from STM to HSS and from open access to subscription-based journals, creating interdisciplinary bridges and new connections for their content. We also integrated all of SciELO on ScienceOpen this year for a more global perspective and have had a great time working with them. We are at over 27 million article records and adding content every day.

In 2016 ScienceOpen brought you more open: The ScienceOpen team participated in and helped organize numerous community events promoting Open Science. From Peer Review Week to OpenCon, talks at SSP in Vancouver and SpotOn in London, our team was on the road, debating hot issues in scholarly communication.

In order to bring more visibility to smaller community open access journals, which very often have close to non-existent funding and are run on a voluntary basis, we launched our platinum indexing competition, geared towards open access journals that charge no APCs to their authors. Four successful rounds in, we have selected 18 journals to be indexed and awarded some of them special featured collections on the ScienceOpen platform. This activity was particularly rewarding, as we heard back from journal editors expressing their enthusiasm about the ScienceOpen project and enjoying bigger usage numbers for their content.

The ScienceOpen 2.017 version will continue to focus on context, content and open science. We are your starting point for academic discovery and networking. Together let’s explore new ways to support visibility for your publications, promote peer review, improve search and discovery and facilitate collection building. Here is to putting research in context! The year 2016 had some great moments – may 2017 bring many, many more!

Your ScienceOpen team

In:  Peer Review  

The winner of our Peer Review Week 2016 competition

For Peer Review Week 2016, we set a simple competition for you all: publicly peer review one of the 25 million research articles on our platform. This fitted perfectly with this year’s theme of ‘Recognising Review’, as every single peer review conducted with us is published openly and made creditable through a CC BY license, which enables unrestricted sharing and re-use of the reviews provided that attribution is given.

We’re happy to announce that Lauren Collister was the winner this year, and a t-shirt is on its way!


Lauren performed a civil, constructive, and detailed peer review of a paper entitled Crowdsourcing Language Change with Smartphone Applications. This article is also part of our Language Change collection, created by George Walkden.


We now have 118 open post-publication peer reviews on our platform. Each one is citable with a CrossRef DOI and can be interlinked with Publons, ORCID, and ImpactStory, helping to build your profile as a researcher. This is a practical demonstration that this form of peer review works!
