
Author: Stephanie Dawson

In:  Other  

Welcome to ScienceOpen version 2.017

Kick off the new year with the new unified search on ScienceOpen! We have accomplished a lot over the last year and are looking forward to supporting the academic community in 2017.

In 2016 ScienceOpen brought you more context: Now your search comes with a new analytics bar that breaks down your search results by collections, journals, publishers, disciplines, and keywords for quicker filtering. Try a search for the pressing topics of 2016 like Zika or CRISPR and take the new features for a spin.

Researcher output, journal content, reference lists, and citing articles can all be dynamically sorted and explored by Altmetric score, citations, date, or activity. Statistics for journals, publishers and authors give an overview of the content that we are indexing on ScienceOpen. Check out the most relevant journals on ScienceOpen, for example BMC Infectious Diseases or PLoS Genetics, for a new perspective. Or add your publications to your ORCID and get a dynamic view of your own output.

Image by Epic Fireworks, Flickr, CC BY

In 2016 ScienceOpen brought you more content: We welcomed publisher customers across the entire spectrum of disciplines to ScienceOpen and expect many more in the upcoming year. We added multiple journals from Brill, River Publishers, Open Library of Humanities and Higher Education Press, and featured collections for PeerJ Computer Science, Cold Spring Harbor Laboratory Press Molecular Case Studies and the Italian Society for Victimology. We had the pleasure of working with a very diverse group, from STM to HSS, from open access to subscription-based journals, creating interdisciplinary bridges and new connections for their content. We further integrated all of SciELO into ScienceOpen this year for a more global perspective and have had a great time working with them. We are at over 27 million article records and adding content every day.

In 2016 ScienceOpen brought you more open: The ScienceOpen team participated in and helped organize numerous community events promoting Open Science. From Peer Review Week to OpenCon, talks at SSP in Vancouver and SpotOn in London, our team was on the road, debating hot issues in scholarly communication.

In order to bring more visibility to smaller community open access journals, which often run on a voluntary basis with close to non-existent funding, we launched our platinum indexing competition, geared towards open access journals that charge no APCs to their authors. Four successful rounds in, we have selected 18 journals to be indexed and awarded some of them special featured collections on the ScienceOpen platform. This activity was particularly rewarding: journal editors wrote back expressing their enthusiasm for the ScienceOpen project and reporting higher usage of their content.

The ScienceOpen 2.017 version will continue to focus on context, content and open science. We are your starting point for academic discovery and networking. Together let’s explore new ways to support visibility for your publications, promote peer review, improve search and discovery and facilitate collection building. Here is to putting research in context! The year 2016 had some great moments – may 2017 bring many, many more!

Your ScienceOpen team

In:  About SO  

Dynamic view of author’s works on ScienceOpen profile

With the launch of our new unified search interface, we restructured the Author Profile page on ScienceOpen, providing dynamic ways to explore an author’s output.

For a very prolific author like Ray Dolan, Director of the Wellcome Trust Centre for Neuroimaging at UCL and author of 674 articles, it can be hard work for a reader even to scroll through the titles of his total output. The new ScienceOpen author profile, however, offers readers a variety of avenues for delving into this content on their own terms. They can sort publications by Altmetric score, citations, usage, date or reviews to find the view that fits their needs.

New enhanced author profiles!

The left side-bar overview shows top collections, journals, publishers, keywords and disciplines. Users can also search within the publication list with a free-text search, or add up to 14 filters to find exactly the content that is relevant to them.

The top metrics bar provides a view of the total usage of the author's articles on the site and of their activity. And if you want to know more about an author's background, just click on the profile button for a biography and more.

How does it work? From the beginning, ScienceOpen has worked closely with ORCID and required an ORCID iD for active participation in the network, so we draw our information from a user's public ORCID profile. If we detect an author who is not identified in our network with an ORCID (we are tracking nearly 15 million authors), we mark the profile as a "record" to indicate a lower level of reliability; see, for example, this profile from Jonathan A. Eisen:

Integrate your ORCID account to activate your full profile record
Useful author-level metrics and context

Below are several examples of interesting profiles on ScienceOpen to inspire you. We welcome you to search, explore, link your ORCID to your own profile and share your experience with us. At ScienceOpen we are striving to serve the academic community and always welcome your input.

Alexander Grossmann

Philip Stark

Yang Gan

Thomas Rosenau

People making a difference in 2016: Open Science Stars

Happy Holidays from ScienceOpen!

As our thank you to all of our wonderful members and users, this year we have decided to give you a special gift. We've taken each of the individual interviews from our Open Science Stars series, which documents a range of experiences and perspectives on the world of Open Science, and assembled them in one collection for you to download here: OPEN SCIENCE STARS. Only by listening to and understanding truly diverse voices can we gain a deeper appreciation of the issues surrounding Open Science. By taking on board what others have to say and learning from them, we strengthen ourselves and the community, and understand more easily how to put things into practice.

Kind regards,

The ScienceOpen team

Download: OPEN SCIENCE STARS

In:  Collections  

The Time is Ripe for ScienceOpen Collections

Image credit: Tsuji, Flickr, CC BY-NC-SA

With nearly 2 million scholarly articles published each year and very limited time (squeezed in between grant proposals, departmental reviews, teaching, writing and the occasional family dinner!), researchers have to pick and choose carefully which articles they read. Recommendation by trusted colleagues is one of the most important filters used by researchers to make decisions on where to focus their attention. This is where ScienceOpen Collections come in!

An academic journal provides topic-specific bundling, editorial selection, quality assurance and often a sense of community. But with shrinking library budgets, spiraling subscription prices and new digital tools, it may be time to look for an alternative. Why not enable experts themselves to create "virtual journals" after publication, drawing from all available articles regardless of publisher or journal? Readers still enjoy the authority and selection of thought leaders, authors enjoy the prestige of having their article "included", and the cost to the library is zero. Plus, shifting prestige to post-publication structures can also prevent "sky-is-the-limit" APCs for fancy brand journals as we move towards more Open Access.

The ScienceOpen Collections offer an expert selection of academic articles across all journals to bring out those hidden gems and undervalued new hypotheses. With post-publication peer review, rating tools and discussion forums, they also invite the reader to contribute – and on ScienceOpen every peer review report is treated as a citable published article with a CrossRef DOI.

This week we are launching several new collections. Professor Barry Marshall won the 2005 Nobel Prize for his discovery of the bacterium Helicobacter pylori and its role in gastric ulcers. In his collection "When did Helicobacter first colonise humans?" on ScienceOpen he explores the evolution and history of both the bacterium and its relationship with humans. "I appreciate the opportunity to pull together papers from different sources into a thematic collection and start a discussion around them," he commented.

Professor Gwyn Gould, at the Institute of Molecular, Cell and Systems Biology of the University of Glasgow, has begun a collection, "GLUT4 Biology", to open up a discussion on the regulation of fat and muscle cell glucose transport by insulin. With an estimated 387 million people suffering from diabetes, it is essential to understand the underlying biology. Professor Gould chose to create a ScienceOpen collection because "diabetes research draws upon work published in many different disciplines and distinct journals; keeping track of this can be tricky, especially for new graduate students. I plan to use this as a forum to initiate discussion with a community of scholars interested in this area, but with a particular desire to see graduate students join in and comment on articles of note, and to suggest their own contributions."

Professor Bernd Fritzsch, Co-Director of the Aging Mind and Brain Initiative at the University of Iowa, has created a collection on "Hearing Loss and Restoration", a topic of increasing importance for our aging society. Hearing impairment is likely the most frequent ailment of the growing cohort of seniors worldwide. While not immediately life-threatening, it cuts seniors off from their established communication patterns, with possibly serious consequences for mental health and social integration. An annually updated collection of relevant papers that helps people see the gains being made in hearing loss prevention, repair and restoration will serve to align research goals with the community outreach needed by those affected by this social impairment.

Dr. Johannes (Jan) Velterop has been involved in developing the concept of "Nanopublications" as a means of dealing with information on a large scale: constructing an overview of the existing knowledge in a given field and finding new connections or associations in the scientific literature that are implicit but have never been explicitly published as such. It is hoped that this approach will offer a way to ingest and digest the essential knowledge contained in the large numbers of relevant scientific articles that researchers increasingly have no realistic possibility of reading one by one.

"It is as simple as pick and choose," says Alexander Grossmann, co-founder of ScienceOpen. "My own scholarly publishing collection has already attracted 25,000 researchers. It is terrific that I can now also track the aggregated social mentions."

With many new collections soon to join these examples, we are excited about this expanding feature. To find out more about becoming a collection editor check out our information here or contact me (Stephanie.Dawson@ScienceOpen.com).

So pick and choose your apples and let’s make an apple pie for the holidays. Together we can change scientific communication to be faster, fairer, less expensive and more open!

 

In:  About SO  

Why I love ScienceOpen Search (and you should too!)

Image credit: Stephan Ohlsen, 365 days 062 ropes, Flickr, CC BY-NC-SA

I want to share with you something cool that we have developed at ScienceOpen.

In my former life as an editor working for a traditional scientific publisher, I had a broad overview of my subject area, but my level of expertise was not close to that of a practicing researcher working in the field. Every day I needed to answer questions like "Who is the most influential researcher in niche area X?", "How does our recently published work stack up against similar articles Y?" and "Are people talking more about topic A or B?".

Editors are not alone in facing these pressing questions. Everyone who searches for information in a field beyond their immediate expertise faces similar problems. In an Elsevier study, 87% of researchers reported searching across disciplines in fields new to them at least once a month.

So what was my solution at the time? Back then, in our small publishing house, a subscription to privately held scholarly databases, which could run to ten or twenty thousand dollars, was simply out of the question. We could make an educated guess, but knowledge is always preferable to guessing. So we ended up taking the subway across town to use the major databases that were only available at the library. In those days, I would have done anything for a freely available open citation network that could tell me the top cited papers and authors across all publishers, recommend related articles, and show which topics were getting the most traction in the popular media.

What did I have to do to get my freely available open citation network? Together with the ScienceOpen team, WE BUILT ONE!! This tool is so awesome that I constantly have to stop myself from accosting strangers on the subway to tell them how much easier we just made their search experience. "Forget about the library," in case they are on their way to access Web of Science or Scopus, "you can search from your home, your office, or right now on your smartphone!"

So how does it work? ScienceOpen already covers over 10 million articles and is growing fast. Type in your search term and filter your results in a myriad of ways. Only articles published in the last two years? Easy. Only Open Access? Check. Even while using these criteria, a search for “Diabetes” brings back 13,053 results. Dilemma. What to read? Sort your results by “Cited by count”. The citation numbers don’t claim to be comprehensive, but they do provide an accurate picture of the relationships between citations on the site. And already, it’s made it easier for me to get a quick overview of what the community finds most important. I can also start asking questions like: why are some papers with an Altmetric score of over 500 cited 20 times, and other papers with an Altmetric score of 3 cited hundreds of times?

When I pick a paper to explore more deeply, ScienceOpen offers me the paper's reference list sorted by citation count, a list of cited authors linked to their other publications in the network, and similar articles based on keywords and title. I can play with this tool all day. But if I need to find a reviewer, a collaborator, an author or an expert, then I am already well on my way. No more long subway rides to access privately held scholarly databases.

Try out this new ScienceOpen feature and tell a friend (but maybe not a stranger on the subway!).

Feel like giving us your feedback? Take our survey or just get in touch with me at stephanie.dawson@scienceopen.com or @SDawsonBerlin on Twitter.

Happy searching!

In:  Other  

Reflections of my trip to Shanghai – huge potential for OA

I wrote this post on the plane back from my trip to Shanghai, after a multi-day delay that (looking on the bright side) allowed me to see some of the sights courtesy of Hainan Airways!

I was invited to speak at the 3rd International Academic Publishing Forum on August 19th. Organized by the Shanghai Jiao Tong University Press, the event brought together nearly 60 Chinese university presses and representatives from some Western academic publishers – Elsevier, Wiley, Springer, Sage, Brill and ScienceOpen – to discuss what we can learn from one another.

My most powerful impression was the high value China places on knowledge. Mr. Shulin Wu, Vice-Chairman of the Publishers Association of China, said in his keynote speech that the government regards "knowledge production to be as important as mining or oil". And China is set to surpass both the US and the EU in spending on research and development by 2020. Communicating this knowledge therefore also has a high priority, and falls mainly to the university presses. Their main short-term goals, expressed over the two days, were internationalization and digitalization of their content, with language seen as the main hurdle. Certainly all had a plan for going global.

But some publishers, including myself, were already thinking beyond internationalization and digitalization to the next step in academic publishing. Jason Wu hit the nail on the head by describing Wiley's transformation "from publishing business to global provider of knowledge and learning services." Solutions for researchers must be digital, global, mobile and interdisciplinary (Bryan Davies of Elsevier quoted a study that found 44% of researchers look for information outside of their own field). And Open Access is a good place to start.

The Open Access business model for journal publishing is perfect for Chinese publishers, who have until now been dependent on cooperation with Western publishers to get their authors heard. Chinese scientists who do world-class research can publish in "world-class" journals such as Science or Nature, but publishers here were asking a hard question of themselves – why are so few of those world-class journals published in China? While Open Access cannot itself address the problem of reputation, it can ensure that research can be read immediately and globally, without a team of sales representatives on every continent. As essentially non-profit entities with a mission to communicate China's research successes to the world, the university presses are uniquely situated. With access to so much outstanding research, I sincerely hope that Chinese publishers will embrace this opportunity.

Having taken the Shanghai subway, I can attest that young Chinese are constantly networking on their mobile devices. A scientific networking and research platform like ScienceOpen would have a good chance of catching the imagination of young scientists in China. But time will tell how open this generation will be allowed to be. During my stay the Chinese government shut down up to 50 online news websites and nearly 400 Weibo and WeChat accounts for spreading "rumours" about the recent chemical explosion, which took 129 lives. Twitter, Facebook, Google and many other sites were blocked during my visit, which left me feeling rather cut off from the rest of the world.

It was a crazy week – from the crowds and flashing neon of Shanghai to the peaceful magnificence of the Great Wall. I came away with a sense of the huge potential in China and the feeling that China needs Open Access and the Open Access movement needs China.

A warm ScienceOpen welcome to two new journals from top German medical publisher Thieme

Image credit: Stonetown hat stall by Gail Hampshire, Flickr, CC BY

Here at ScienceOpen we wear a few different hats! We’re a gold Open Access (OA) publisher, a peer review reformer and a content aggregator.

This week, with the London Book Fair 2015 about to start, we are celebrating publishers and societies by profiling the innovative ways that they are using our platform!

It gives us great pleasure to report how a top scientific union and a major medical publisher (see below) are now using our platform to give their OA content increased visibility and facilitate scientific discussion.

With 1.5 million OA articles and a high-performance search engine on ScienceOpen, users can slice and dice the content as they like. And often that selection criterion may be a trusted publisher or an innovative journal. ScienceOpen is making that easy! With ScienceOpen Collections we're able to highlight the articles of publishers and societies. Other innovative ways to use the Collection Tool are discussed in this blog post.

For the first time, mirror versions of two new OA journals – the American Journal of Perinatology Reports and The Thoracic and Cardiovascular Surgeon Reports (partners to the American Journal of Perinatology and The Thoracic and Cardiovascular Surgeon) – have been re-created on our platform. This allows them to be fully integrated into the scientific conversation. Both journals are published by Thieme, an award-winning international medical and science publisher.

You can find the American Journal of Perinatology Reports and The Thoracic and Cardiovascular Surgeon Reports Collections on ScienceOpen under the publisher Thieme. These medical case reports are now available for commenting, sharing, and Post-Publication Peer Review (PPPR) by experts with five publications on their ORCID record, as are all the articles aggregated on our site. The great thing: every review receives a CrossRef DOI, so each contribution can be found and cited. We believe that this is a fantastic way to credit the important work of reviewers, too!

ScienceOpen CEO Stephanie Dawson presented this concept at the Scientific Publishing Innovation Day organized by the Frankfurter Buchmesse in London on April 13th, just before the London Book Fair.

March Conference Madness – Poster Publishing Dilemmas

Image credit: LuMaxArt, Flickr, CC BY-SA

It's March, and so naturally the upcoming whirlwind of large scholarly conferences is on my mind. If I were still in the USA, I might also be participating in a friendly basketball bet!

I recently attended the 5th International Conference of the Flow Chemistry Society in Berlin. It was expertly organized by SelectBio and featured everything that we expect from a scholarly conference – top scientists as keynote speakers, a poster session for Early Career Researchers to present preliminary data and, most importantly, coffee breaks to raise our energy so that we could exchange ideas with other participants.

But one innovation struck me: each participating poster exhibitor had been offered the opportunity to publish their poster via e-Posters. Because ScienceOpen also offers poster publishing (now free of charge), I was interested in exchanging experiences with them. I had a great talk with Sara Spencer about how poster publishing can support researchers by encouraging discussion of their work after the conference or with colleagues who were not able to attend. Publishing posters on a platform that provides each one with a DOI, as we do here at ScienceOpen, also means that the author can be credited if the poster is, for example, photographed and shared on Facebook.

However, both Sara and I have also observed that scientists are sometimes hesitant to "publish" their posters at all, which surprised us, since the benefits seem clear. The two most frequent questions about poster publishing that we encounter are:

  1. What is the advantage of publishing my poster?

Some posters get hung in the department hallway, but most end their lives rolled up under a desk somewhere. By making your poster digitally available beyond its physical presence at a conference, you can extend the discussion of your research and possibly even find new collaborators. Of course, you can also do this by posting it on your website or in a repository. But by publishing it under a CC BY license and with a CrossRef-registered DOI, you also make it possible to track the impact it has through altmetrics such as downloads and social shares – making it a much more valuable asset for your CV.

  2. This is preliminary research – can I publish these results later as a research article?

Most publishers recognize that science cannot move forward in a communication vacuum, and rules around sharing are changing with the rise of online discussion forums. No one is quite sure where the new lines on such issues will be drawn. Scientists regularly share their preliminary research at conferences in the form of talks and posters, or on pre-print servers such as arXiv or bioRxiv. Early feedback can save a researcher time and funding dollars.

The scientific community understands that there is a big difference between preliminary results presented in a pre-print or a poster and a full research paper. Most journal editors also have no problem making this distinction. A list of the pre-print policies of major academic journals can be found on Wikipedia. A list of how different journals view F1000Posters (and most do not regard them as pre-publication) can be found here.

However, it's important to know that some journals do still regard posters as prior publication, and these include some big names such as the journals of the American Chemical Society, the Royal Society of Chemistry, the American Physiological Society, the American Society for Microbiology, and the NEJM. When we contacted some poster session organizers at a large society conference about the possibility of publishing this content with DOIs on ScienceOpen, one of them checked back with the society for its view and received this ominous warning:

We would caution you, and we would ask you to caution your presenters, that intellectual property rights issues, such as patent or other proprietary concerns, may be implicated by agreeing to the publication of posters.

Our answer to the above statement is to ask "how so?" Whether the author retains copyright and grants a CC license to publish, or gives copyright to the publisher, how is the IP of a poster different from that of an article? If they mean, as stated on the F1000Posters list, that they consider the limited and often preliminary content displayed on posters from Early Career Researchers to be prior publication, then we say "good luck with that view in the digital age"!

What seems more likely to us is that large traditional publishers are using the same IP "scare tactics" that we last saw in the early days of Open Access. What they are trying to do is discourage poster or pre-print publishing with a DOI (as with their restrictive policies on live tweeting at conferences) because they don't want these citations to lower the Impact Factors of their journals.

The scientific community is beginning to experiment with new tools for sharing and networking online, and this is putting pressure on established structures and rules. To those researchers we say:

Be sure to publish your posters or pre-prints with a DOI so they can be found and cited. Then publish your subsequent full article with organizations that have progressive policies on prior sharing – preferably Open Access!

In:  Impact Factor  

Article vs Journal Impact – Perspective from PLOS ONE Editorial Director Damian Pattinson

The Hellas Impact Basin on Mars (edited topographical map), which may be the largest crater in the solar system. Credit: Stuart Rankin, Flickr, CC BY-NC

Earlier this summer, I skyped with Damian Pattinson, the Editorial Director of PLOS ONE, about the Impact Factor, its widespread misuse and how, thankfully, altmetrics now offer a better way forward.

Q. The PLOS ONE Impact Factor has decreased for a few years in a row. Is this to be expected, given its position as the world's largest journal and its remit to publish all good science regardless of impact?

A. I don’t think the Impact Factor is a very good measure of anything, but clearly it is particularly meaningless for a journal that deliberately eschews evaluation of impact in its publications decisions. Our founding principle was that impact should be evaluated post-publication. In terms of the average number of citations per article, my sense is that this is changing due to the expanding breadth of fields covered by PLOS ONE, not to mention its sheer size (we recently published our 100,000th article). When you grow as quickly as we have, your annual average citation rate will always be suppressed by the fact that you are publishing far more papers at the end of the year than at the beginning.
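For readers who want the calculation spelled out, the standard two-year Journal Impact Factor for a census year Y is defined as follows (the symbols C and N are just shorthand introduced here for clarity):

\[
\mathrm{IF}_{Y} \;=\; \frac{C_{Y}(Y-1) + C_{Y}(Y-2)}{N_{Y-1} + N_{Y-2}}
\]

where C_Y(y) is the number of citations received in year Y by items published in year y, and N_y is the number of citable items published in year y. Because every item from the two-year window counts equally in the denominator, a rapidly growing journal keeps adding recent papers that have had little time to be cited, which drags the average down – exactly the effect described above.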

Q. Articles at PLOS ONE undoubtedly vary in terms of the number of citations they accrue. Some are higher, some lower. Is there an observable pattern to this trend overall that is not reflected by a simple read of the Impact Factor?

A. Differences in the average number of citations are, to a large extent, subject-specific and therefore a reflection of the size of a particular research community. Smaller fields simply produce fewer scientific papers, so statistically it is less likely that even a highly cited paper will have as many citations as one published in a larger research field. Such a subject-specific examination may also reveal different patterns if one looks at metrics besides citation. That is something we are very interested in exploring with Article-Level Metrics (ALM).

Q. Has the reduction of PLOS ONE’s Impact Factor influenced its submission volume or is that holding up relatively well?

A. Actually, the effective submission volume is still increasing, even though the rate of growth has slowed. Year-on-year doubling in perpetuity is not realistic in any arena. We have seen a drop in the number of publications, however, due to a number of factors. Most notably, we have seen an increase in the rejection rate as we continue to ensure that the research published in PLOS ONE is of the highest standard. We put all our papers through rigorous checks at submission, including ethical oversight, data availability and adherence to reporting guidelines, so more papers are rejected before being sent for review. We have also found an increase in submissions better suited to other dissemination channels, and have worked with authors to pursue them. But to your point, I do not think that last year's changing IF directly affected PLOS ONE submission volume.

Q. Stepping back for a moment, it really is extraordinary that this arguably flawed mathematical equation, first mentioned by Dr Eugene Garfield in 1955, is still so influential. Garfield said “The impact factor is a very useful tool for evaluation of journals, but it must be used discreetly”.

It seems that the use of the IF is far from discreet, since it is a prime marketing tool for many organizations – although not at PLOS, which doesn't list the IF on any of its websites (kudos). But seriously, do you agree with Garfield's statement that the IF has any merit in journal evaluation, or that evaluating journals at all in the digital age has any merit?

A. Any journal-level metric is going to be problematic as "journals" continue to evolve in a digital environment. But the IF is particularly questionable as a tool to measure the "average" citation rate of a journal because the distribution is hardly ever normal – in most journals a few highly cited papers contribute most of the IF, while a great number of papers are hardly cited at all. The San Francisco Declaration on Research Assessment (DORA) is a great first step in moving away from using journal metrics to measure things they were never intended to measure, and I recommend that everyone sign it.
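A small illustrative calculation (with made-up numbers) shows how a skewed distribution inflates that "average". Suppose a journal publishes 100 citable items in the two-year window; 5 of them collect 40 citations each in the census year and the other 95 collect 1 each:

\[
\mathrm{IF} \;=\; \frac{5 \times 40 \;+\; 95 \times 1}{100} \;=\; \frac{295}{100} \;=\; 2.95
\]

The journal's Impact Factor is 2.95, yet the median article in it received a single citation.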

Q. What are the main ways that the IF is misused, in your opinion?

A. The level to which the IF has become entrenched in the scientific community is amazing. Grants, tenure and hiring at nearly every level depend on the IF of the journals in which a researcher publishes his or her results. Nearly everyone realizes that it is not a good way to measure quality or productivity, but uses it anyway. Actually it's more complicated than that – everyone uses it because they think that everyone else cares about it! So academics believe that their institutions use it to decide tenure, even when the institutions have committed not to; institutions think that the funders care about it despite commitments to the contrary. In some way the community itself needs to reflect on this and make some changes. The IF creates perverse incentives for the entire research community, including publishers. Of course journals try to improve their score, often in ways that are damaging to the research community. Because of how the IF is calculated, it makes sense to publish high-impact papers in January so that they collect citations for the full 12 months. Some journals hold back the best papers for months to increase the IF – which is bad for researchers as well as for science as a whole. Journals also choose to publish papers that may be less useful to researchers simply because they are more highly cited. So they will choose to publish (often unnecessary) review articles, while refusing to publish negative results or case reports, which will be cited less often (despite offering more useful information).

Q. Could you imagine another metric which would better measure the output of journals like PLOS ONE?

A. Of course you are right: for journals that cover a broad range of disciplines, or for interdisciplinary journals, the Impact Factor is even less useful because of the subject-specific statistics we spoke of earlier. There have been a number of newcomers, such as ScienceOpen, PeerJ and F1000Research, with a very broad scope – as these and other new platforms come into the publishing mainstream, we may find new metrics to distinguish strengths and weaknesses. Certainly the Impact Factor is not the best mechanism for assessing journal quality and, even less so, researcher quality.

Q. How do you feel about ScienceOpen Advisory Board Member Peter Suber's statement in a recent ScienceOpen interview that the IF is "an impact metric used as a quality metric, but it doesn't measure impact well and doesn't measure quality at all"?

A. How often a paper is cited in the scholarly literature is an important metric. But citations are a blunt tool at best for measuring research quality. We do not know anything about the reason a paper was cited – it could in fact be to refute a point or as an example of incorrect methodology. If we only focus on citations, we are missing a more interesting and powerful story. With ALMs that also measure downloads, social media usage, recommendations, and more, we find huge variations in usage. In fields beyond basic research, such as clinical medicine or applied technology, which have implications for the broader population, a paper may have a big political impact even though it is not highly cited. ALMs are really starting to show us the different ways different articles are received. At the moment we do not have a good measure of quality, but we believe reproducibility of robust results is key.

At PLOS we have been at the forefront of this issue for many years, and are continuing to innovate to find better ways of measuring and improving the reproducibility of the literature. With our current focus on "impact" we are disproportionately rewarding the "biggest story", which may have an inverse relationship to reproducibility and quality.

Q. PLOS has a leadership role within the Altmetrics community. To again quote ScienceOpen Advisory Board Member Peter Suber on the current state of play: “Smart people are competing to give us the most nuanced or sensitive picture of research impact they can. We all benefit from that.”

Did PLOS predict the level to which the field has taken off and the amount of competition within it or is the organization pleasantly surprised?

A. The need was clearly there and only increasing over time. When we began our Article-Level Metrics (ALM) work in 2009, we envisioned a better system for all scholars. This is certainly not something specific to Open Access.

Since then, the particular details of how we might better serve science continue to evolve, especially now that the entire community has begun to actively engage with these issues together. It's great that there is increasing awareness that the expanding suite of article activity metrics cannot fully come of age until the data are made freely available for all scholarly literature and the metrics are widely adopted. Only then can we better understand what the numbers truly mean in order to appropriately apply them. We anticipate that open availability of data will usher in an entire vibrant sector of technology providers that each add value to the metrics in novel ways. We are seeing very promising developments in this direction already.

Q. What’s next for PLOS ALM in terms of new features and developments?

A. Our current efforts are primarily focused on developing the ALM application to serve the needs not only of single publishers but of the entire research ecosystem. We are thrilled, too, that the community is increasingly participating in this vision, as the application grows into a bona fide open-source community project with collaborators across the publishing sector, including CrossRef. On the home front, the application is essentially an open framework that can capture activity on scholarly objects beyond the article, and we'll be exploring this further with research datasets. Furthermore, we will be overhauling the full display of ALM on the article page metrics tab with visualizations that tell the story of article activity across time and across ALM sources. We will also release a set of enhancements to ALM Reports so that it better supports the wide breadth of reporting needs for researchers, funders, and institutions.

ScienceOpen Author Interview Series – Daniel Graziotin

Today's interview comes from another recent ScienceOpen author, Daniel Graziotin. In his ScienceOpen article, "Green open access in computer science – an exploratory study on author-based self-archiving awareness, practice, and inhibitors," he analyses the results of an exploratory questionnaire given to computer scientists. It addressed issues around various forms of academic publishing, self-archiving of research, and copyright. In the following interview with ScienceOpen, Graziotin gives a unique perspective, as a young scientist and as a software and web developer coming to scientific publishing from the world of open software development. His ideas are bound to be interesting to emerging scientists in particular, as he represents a new, globally engaged generation of scientific researchers making full use of open knowledge to further innovation.
