Following well-received news earlier this week that we have enabled content filtering (over 10 million articles and records) on ScienceOpen by Altmetric score (which measures social and mainstream media attention) and by citations, we’re delighted to share this conversation between Euan (Altmetric) and Stephanie (ScienceOpen).
Euan: ScienceOpen is beginning to show up on our radar as a content aggregator. What is your goal with ScienceOpen and where are you heading?
Stephanie: Our goal has always been more open scholarly communication.
ScienceOpen is a freely accessible network for aggregating, sharing, and evaluating research information, with over 10 million Open Access articles and bibliographic records. Moving forward, our focus is on exposing the context of scholarly content. Powerful search and filtering tools, including the first publicly available citation index and now the article Altmetric score, will help researchers rapidly find the literature they need.
Altmetric is also an information aggregator and has strongly influenced the debate on how to measure research impact. Altmetric is highlighting the benefits of Open Access in terms of increased attention by the scholarly community. I think it was during a conference coffee break that we talked about how great it would be to be able to search and filter by Altmetric score – and now here we are: natural partners!
Euan: What first got you interested in altmetrics and why were you keen to add the Altmetric badges to the site?
Stephanie: At ScienceOpen, the individual research article is always at the center of what we develop. The Altmetric score provides unique insight into the quality and quantity of attention that a scholarly article has received. If citations represent the genealogy of an idea, altmetrics track its dissemination. Together they give a fuller picture of the “impact” of an article – a tricky category but a worthwhile goal.
By making search results filterable by both citation numbers and Altmetric score, we can provide researchers with different entryways into the data – and that in and of itself may generate new ideas. That is why we were so interested in including the Altmetric badges on the site.
And of course we love the rainbow donuts!
Euan: How do you think that authors or researchers can make best use of your platform?
Stephanie: In three main ways.
It’s a great discovery resource. A search on ScienceOpen does not just pull up a list of article records, but rather a network of information. Topics and articles can be explored via authors, references, keywords, altmetrics, comments and more. Results can be narrowed and sorted and the search parameters saved. Most importantly, the research itself is center stage independent of publisher and journal. We strive to expose as much context for the research on our site as possible.
All content on the platform is available for Post-Publication Peer Review by scientific members with five or more peer-reviewed publications on their ORCID, which helps maintain a high standard of discourse. Our larger goals here are to speed up the communication of science by moving its evaluation to after publication, to eliminate anonymity in the interests of transparency, and to ensure that the conversation around science never ends. From our perspective, quality assurance does not end at the moment of publication.
ScienceOpen also appoints members of the research community to the role of Collection Editor; they curate articles from multiple publishers on any topic using our Collection tool. The big picture here is to complement the topical bundling done by individual journals and publishers with flexible post-publication collections across all scientific knowledge. The best papers can be included, regardless of whether they were published on a preprint server or in a top journal. In this way we can support the values espoused by DORA by developing alternatives to the Impact Factor.
Euan: On the technical side, the content you host – could you tell us where it comes from, and how much we’re talking about in terms of volume?
Stephanie: Our platform currently consists of over 10 million articles and records. To date we have imported 950K full-text Open Access articles from PubMed Central and roughly 830K records from arXiv. The additional roughly 8 million bibliographic records are extracted from the references within the full-text content. We have started updating the records with full metadata from external sources (currently PubMed and ORCID) for better usability of the content. We also compare the references to ensure good matching, so we can merge reference data to create our citation index.
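The reference-matching step described here – deciding when two extracted references point to the same work so their data can be merged – can be sketched roughly as follows. This is purely illustrative: the field names, the DOI/title fallback rule, and the normalization choices are assumptions for the sketch, not ScienceOpen’s actual pipeline.

```python
from collections import defaultdict
import re

def match_key(ref):
    """Build a normalized key for a bibliographic record.

    Records carrying a DOI are keyed on the DOI directly; otherwise we
    fall back to a lowercased, punctuation-stripped title plus the year.
    """
    if ref.get("doi"):
        return ("doi", ref["doi"].lower().strip())
    title = re.sub(r"[^a-z0-9 ]", "", ref.get("title", "").lower())
    return ("title", " ".join(title.split()), ref.get("year"))

def build_citation_index(references):
    """Merge duplicate references and count how often each work is cited."""
    index = defaultdict(int)
    for ref in references:
        index[match_key(ref)] += 1
    return dict(index)

refs = [
    {"doi": "10.1000/xyz123", "title": "A Study"},
    {"doi": "10.1000/XYZ123", "title": "A Study"},   # same paper, DOI case differs
    {"title": "Open  Peer Review!", "year": 2015},
    {"title": "open peer review", "year": 2015},     # same paper, formatting differs
]
index = build_citation_index(refs)
# Four raw references collapse to two distinct works, each counted twice
```

In practice a production matcher would be fuzzier (author surnames, pagination, edit distance on titles), but the core idea is the same: normalize each reference to a canonical key, then merge counts per key.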
Euan: Where will site visitors be able to find the badges/what can they expect to see?
Stephanie: Researchers will find the Altmetric badge both on the search results page, where they can filter their search by Altmetric score to find the most talked-about papers in their field, and on each individual article page as part of the article metrics. Once researchers have landed on a paper of interest, they can drill down to find out exactly which aspects of the research people are talking about. The score itself is just a starting point for discovering more, and we hope researchers will treat it critically, as with any metric.
As we continue to develop the site we may find unique ways to present the Altmetric score such as an aggregated Altmetric score for collections.
Euan: Do you offer any other article level metrics?
Stephanie: We are committed to providing as much context for an article as possible, and article-level metrics are central to this mission. On each article page we have a summary box that displays reader numbers on ScienceOpen, citations, post-publication reviews, comments, recommendations and shares.
Search results from within the 10 million articles and records on the site can be filtered by reader count, review rating and, most recently, number of citations.
We have taken the first steps towards a publicly available citation index, something that the scientific community truly needs. Researchers can sort their search results by citation number, view the reference list sorted by citations and see other articles by the same author, with more contextual information to come. These citation numbers are correct in a relative, if not an absolute, sense, and together with the Altmetric badge they can be very useful for quickly sorting articles by the attention they have received from the scientific community.
Euan: Would you like a Donut?
Stephanie: I don’t mind if I do. By return, here’s a 41-second video, “How to filter your search by Altmetric”, complete with groovy Berlin techno soundtrack (nice going, Dan Cook!).
Publishers can join in by indexing their journal content – across all research disciplines (now including the humanities and social sciences) and license types – on ScienceOpen for enhanced visibility within a wider academic context. This unique service is open to all researchers worldwide and will be launched at the Frankfurt Book Fair this week.
Since its launch in 2014, ScienceOpen has exponentially grown its database to allow researchers to more easily navigate, search and comment on scientific articles. A search on ScienceOpen does not just pull up a list of article records, but rather a network of information.
Topics and articles can be explored via authors, references, keywords, altmetrics, comments and more. Results can be narrowed and sorted and the search parameters saved. Content – popping up in the context of such search & navigation – is pulled center stage independent of publisher and journal.
Stephanie Dawson, CEO of ScienceOpen, says:
All these features provide a superior search experience for researchers and advantages for publishers in having their content and brand promoted. With this new offering, we are expanding Open Access to indexing information at the point of (re)search.
The ScienceOpen network is freely accessible for researchers to join, search, discover and share. This new feature will be introduced at the Frankfurt Book Fair (FFBF) this week, the world’s largest event for academic publishers. Talk to our team in and around Hall 4 at the FFBF to learn how to include your content and benefit from this fast-growing index.
There are many things in life that are (arguably) better in the digital age. Many of these improvements we take for granted: no longer getting lost traveling from A to B thanks to Google Maps; locating errant teenagers using their phone GPS; reading the NYT on the go; reaching out to powerful (and less so) people on Twitter or interacting with family and friends using Facebook. Overall, there appears to be a greater sense of transparency in our own lives and those of others.
Just as with any British Firework display on the 5th of November (Guy Fawkes Night), we’ve saved the best until last!
Here at ScienceOpen we wear three hats: Publisher, Aggregator and Reformer and it’s in this final regard that we take the most pride.
Earlier this year, Jan Velterop, a thought leader in scholarly publishing, wrote to me, shared his proposal for Peer Review by Endorsement and wondered if ScienceOpen might be interested in making his long-standing wish a reality.
No sooner had he written than he found himself added to the Advisory Board, and we announced our plans to add this process to our existing methodology (those with five or more peer-reviewed publications on their ORCID can become members of the network and review content).
Now, a few months later, for the first Peer Review Week, Jan has published a juicy Opinion piece (we publish all types of articles, not just original research) with us entitled:
For our part, we have added instructions on how to publish this way to the site. Why do we like this idea? Because rather than publisher-mediated peer review before publication, the scientific community takes this role and the publisher verifies the results. As Jan puts it:
It is more efficient and cost effective to hand peer review entirely back to the scientific community, where it rightly belongs, than for publishers to find the right, appropriate, available, reliable, expert reviewers.
Whether you prefer to get your work professionally evaluated both before and after you publish it, or simply leave it all until after publication, the choice is yours – and the choice is now (there is still time to try this process and get your paper published before the end of 2015!).
We talk a lot about peer review in the scholarly communications world. Many of us – and our organizations – are working to improve both the process and the experience for researchers, which has led to a significant increase in the range of options available, especially – but not exclusively – for reviewing journal articles. From double-blind to completely open review, pre- and/or post-publication, and even transferable peer review, not to mention the work being done on peer review recognition and validation by organizations like Publons and PRE, there’s a plethora of new approaches and services to choose from.
But what do researchers make of all this? What are their experiences of peer review? How and why do they review themselves, and what do they get from reviews of their own work? In this reflection from researchers around the world, we asked some of them to tell us about their views of peer review.
By and large, their feedback was very positive, with good experiences outweighing bad and universal agreement that peer review is, as Elizabeth Briody of Cultural Keys, USA, says: “a critically important process for evaluating the merit, content, relevance, and usefulness of scholarly publications” – or as Hugh Jarvis, Cybrarian, University at Buffalo, USA, describes it: “Peer review is the glue of academic publishing.” Saurabh Sinha, Executive Dean, Faculty of Engineering & the Built Environment, University of Johannesburg, South Africa agrees that: “it positions our work with respect to the body of already published knowledge. The approach also helps to ensure, as far as possible, the correctness of the work, elimination of potential blind spots, and validity of assumptions for a practical world.”
Pretty much everyone noted the importance of peer review – both as reviewer and author – to them personally as well as professionally. For example, Professor Yongcheng Hu, a medical researcher in China, commented: “Peer review is an essential arbiter of scientific quality, no doubt, it has a great impact on scientific communication and is of great value in determining academic papers’ suitability for publication, while for me, via personal experience, it is also a process of exploration and sublimation.” Erik Ingelson, Professor of Molecular Epidemiology at Uppsala University in Sweden and currently Visiting Professor at Stanford University, USA, adds: “Mostly, my experiences of being a reviewer have been positive; I get to think critically about study design and methods and learn new things on the way. Similarly, most of the time the review process is positive also as the author, since you get valuable input and the paper that comes out is often better than the original submission.” Anna Cupani, a Belgian researcher, agrees: “Having someone reading and commenting on your research is beneficial for several reasons: it validates your work, it confirms what you are doing is meaningful not only for you but for a wider scientific audience and it helps you focus and improve your research. You never grasp the meaning of something as deeply as when you have to explain it to someone else!” And Lee Pooi See, Associate Chair (Research), School of Materials Science and Engineering, Nanyang Technological University, Singapore, adds: “My personal experience of being reviewed has been interesting, especially in receiving scientific viewpoints from different reviewers on emerging topics. Peer review also steers us to identify those unaddressed aspects of the related research topics.”
Several people also commented that there are upsides and downsides to peer review. Janine Milbradt, who is currently working on her PhD at the Institute for Human Genetics, University of Cologne, Germany, says: “You never know what is going to happen! All you can be sure about is that you will have to put another 3-6 months of work into your paper. Having a paper reviewed is a nerve-stretching process, filled with hopes and dreams about the reviewers actually liking your research. On a more serious note, the review process is a very important tool to find incomprehensible or knowledge lacking parts of your research to improve your paper.” Professor Wong Limsoon, KITHCT Professor of Computer Science, National University of Singapore comments: “I appreciate very much constructive reviews that gave me really useful suggestions on my work. I am sometimes annoyed by uninformed comments, but fortunately these are few.”
So what improvements to peer review would our group of researchers like to see? To quote Professor Sinha again: “Scholarly peer-review has…the opportunity to improve beyond the past, where today, coupled with data, crowd-sourced reviews/discussion, newer open-access technologies could play a dynamic role of developing credibility of research-work and at the same time increasing competition!” Hugh Jarvis likewise has “great hopes that peer review will develop a much more expanded role in the future, and provide input before and after publication, similar to the role the comments serve in Current Anthropology and the product ratings in sites like Amazon.com.” And Joao Bosco Pesquero, Professor, Federal University of Sao Paulo, Brazil would also like to see a more open approach: “The more openly we produce science and expose our work to criticism, the more it helps to improve what we do.”
Perhaps the best summary of why researchers continue to value peer review – both as authors and as reviewers – comes from PhD student Grace Pold of UMass – Amherst, USA, who told us: “Although I have had the opportunity to formally review only four or five papers, reviewing papers is one of my favorite things to do. First off, it is a good reminder that not all papers are born perfect, and when I am struggling to try and finish my own work and the prospect of a well-polished manuscript seems too far in the distance, it gives me hope. Second, is there a better opportunity to see what your colleagues are working on and thinking about than by reviewing their work? Third, the idea of being able to help shape the information released into the public sphere is very enticing. Fourth, it is a great excuse to really think about the assumptions you and others make in your research…when you review, it is your responsibility to stop and think about why this is the way things are done. Fifth, thinking up alternative interpretations and then filtering through the data presented in the paper to determine the robustness of the conclusions is a rewarding challenge. Finally, reviewing papers provides an opportunity to slow down and formulate a full, well-rounded opinion on something, which happens unfortunately rarely in the life of the frantic modern scientist stuck in with the nitty-gritty details of doing experiments. And I think that from a personal perspective, that final point of generating a sense of accomplishment in doing a good job in thinking things through to the end is probably the greatest motivation for me to review papers.”
Imagine if you will a perfect world where all knowledge is openly available to use and share without restriction. This might seem like a bit of a stretch most days but bear with me here!
Imagine that the content narrative continues to move beyond the confines of today’s mainly static article; that an ongoing stream of results, data, figures and ideas flows for transparent review and discussion; in short, that a reductionist approach to scientific communication prevails, rendering journals, with their slow publication cycles and impact factors, obsolete.
It’s not that hard to see the evidence of these trends already. Think about the rise of blogs and social media as suitable places for scientific discussion, the growing importance of continuous publication, data sharing and interactive figures. All this in the pursuit of making research and researchers themselves more visible, as they deserve to be.
This Peer Review Week, ScienceOpen wants to pose a simple question. As the number of research outputs grows and diversifies (data sets, negative results, case reports, preprints, posters…), is the research community going to be able to peer-review all these objects prior to publication?
We think not. There isn’t enough time in the day, money to pay for it, or even appetite for doing this now. Will these outputs be useful nonetheless? Absolutely – if we have a powerful way to find and filter them based on parameters that readers find helpful and authors find rewarding. For example:
What do my peers think of this information?
Are there any updates to it?
What impact did it make in the world and who noticed?
Which work is worth highlighting in a specific field?
How many times was it cited and where?
If I took the time to review it, can my contribution be found and cited?
Will these efforts enhance my career prospects?
None of these valid questions is affected by an evolution away from single- or double-blind anonymous peer review, apart from the speed with which we can answer them. Transparent processes and simple web tools can filter faster, better and cheaper than journals and pre-publication peer review ever could.
This is why at ScienceOpen we’ve developed systems for Post-Publication Peer Review; Versioning; DOI allocation; Article Metrics; Collections; Open Citation Information and more – to demonstrate a different (and we would argue better) way forwards.
This inaugural Peer Review Week, we invite you to consider this argument – and, by all means, disagree with us. We look forward to a lively and spirited debate!
Life in California is good. Truthfully, that’s an understatement. As an expat Brit, it’s great. Public holidays are rarely marred by rain; tomatoes grow outdoors (as do oranges and avocados); every work day is “casual Friday”.
There’s really only one downside, and that’s our time zone which means that in terms of the global conversation, we are constantly last to the party!
And so it goes with the first ever Peer Review Week. As the “lady at the helm” for social media, I find it is already lunchtime here in San Francisco and I am frantically trying to catch up with all the stories that everyone else has posted.
Rather than give you an exhaustive list of the conversations and coverage, which you can see for yourself at #peerrevwk15, I am going to highlight a few that particularly stood out to me.
Listen to this podcast by Chris O’Neil from Bioscientifica, which begins with a truism: “none of us like the peer review process”! He goes on to explain that, despite this visceral reaction, most researchers accept that their article is improved by it.
These articles (and those that we hope to publish) are curated by Professor Friedrich C. Luft, Director of the Experimental and Clinical Research Center (ECRC) at Charite and Max Delbrueck Centre in the Helmholtz Association in Berlin, Germany and Dr. Nana Bit-Avragim, Physician-Scientist and Open Access Advocate.
Clinical case reports remain an essential part of lifelong learning in medicine. Reading at least one a day allows clinicians to hone their differential diagnosis skills beyond their own immediate bedside. Indeed, this knowledge is so vital for shaping the best patient outcomes that it deserves to be openly published so that everyone, regardless of their resources, can read and re-use it as they wish, enabling them to:
Share interesting and unique disease manifestations and diagnostic methods
Provide an invaluable first-hand source of evidence about general and novel therapeutic approaches across the globe
Help identify life threatening adverse reactions to medications
Exchange practice information and generate a wider search for evidence
To all clinicians out there we say “unlock your education, doc!” by openly reviewing the articles in this collection, or any that you find interesting among the nearly 10 million items of content (open articles and toll stubs) on the platform. If that number sounds a bit intimidating, remember that we have sophisticated search tools (<3 minute video), including an open citation index, to help you find exactly what you are looking for.
What do we hope to achieve at this year’s COASP meeting? Stephanie (our CEO) would like to chat with as many of our fellow publishers as possible about how you, like Thieme, can use our platform to raise the visibility of the OA research that you publish even further, which is of benefit to your authors and their careers.
As some, but not all, of you know, ScienceOpen has developed a Collections Tool which allows Community Editors to curate articles from any OA publisher in any way they choose.
We offer a safe and legal networking option for encouraging conversation around content that complies with publisher policies. ScienceOpen invites those attending COASP to find Stephanie (@SDawsonBerlin) and get involved!
Peer review – the evaluation of scientific, academic, or professional work by others working in the same field – lies at the heart of scholarly communication.
At its best, peer review is a rigorous analysis aimed at improving the article itself, the science behind it, or frequently both. At its worst, peer review is an obstacle course that appears engineered to prevent publication, or at best delay it by months or even years!
This dichotomy of author experience, together with the ever-increasing publicity attached to retractions, raises real issues for faith in the process itself and public trust in science. It seems prudent, then, that we act in a cohesive manner to see what improvements can be made while still acknowledging the important role that peer review plays.
Peer Review Week grew out of informal conversations between ORCID, ScienceOpen, and Wiley. Each organization has a different perspective on peer review, and has been working independently to better support its role in scholarly communications. Joining forces enables all three organizations to share their central message – that good peer review, whatever shape or form it might take, is critical to scholarly communications – more widely and powerfully. Sense About Science has joined the week to ensure the wider benefits of peer review – as a quality mark and tool for making sense of science claims – are shared with the public.
Our informal partnership will promote the first ever Peer Review Week, from Monday, September 28 through Friday, October 2.
During this week we’ll be sharing stories and videos, participating in a webinar on Trust and Transparency in Peer Review (with Kent Anderson, Alexander Grossmann, Laure Haak, Andrew Preston and Verity Brown), and running a Twitter campaign with the hashtag #peerrevwk15. We also invite other organizations working in this space – such as The Winnower, PeerJ, F1000Research, BMC, Publons, PubPeer and more – to participate in this virtual campaign.
Here’s our position on peer review at ScienceOpen – and we know that not everyone agrees with us!
Our goal is to augment trust in the peer review process by making it entirely transparent. We facilitate Post-Publication Peer Review, by named individual experts with five or more peer-reviewed publications listed on their ORCID, of the nearly 10 million open access articles and toll stubs currently available on the platform. We’re delighted to support this inaugural Peer Review Week.